TPGen: a language model for stable protein design with a specific topology structure

BMC Bioinformatics. 2024 Jan 23;25(1):35. doi: 10.1186/s12859-024-05637-5.

Abstract

Background: Natural proteins occupy only a small portion of the protein sequence space, whereas artificial proteins can explore a much wider range of that space. However, blindly generated sequences may fail to meet specific design requirements. Research indicates that small proteins have notable advantages, including high stability, structures that can be predicted accurately, and ease of specificity modification.

Results: This study constructs a neural network model named TopoProGenerator (TPGen) based on a transformer decoder. The model is trained on sequences of at most 65 amino acids. The training of TopoProGenerator incorporates reinforcement learning and adversarial learning for fine-tuning, together with a stability prediction model trained on a dataset of more than 200,000 sequences. The results demonstrate that TopoProGenerator can design stable small-protein sequences with specified topology structures.
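
To make the architecture concrete, here is a minimal, hedged sketch of a decoder-only (GPT-style) language model over the 20 standard amino acids, corresponding to the transformer decoder described above. The vocabulary layout, special tokens, and layer sizes are illustrative assumptions, not values reported in the paper.

    # Minimal decoder-only language model over amino acids (PyTorch).
    # Vocabulary, special tokens, and layer sizes are illustrative assumptions.
    import torch
    import torch.nn as nn

    AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
    PAD, BOS, EOS = 0, 1, 2        # assumed special tokens
    VOCAB_SIZE = len(AMINO_ACIDS) + 3
    MAX_LEN = 65                   # maximum training sequence length (from the abstract)

    class ProteinDecoder(nn.Module):
        def __init__(self, d_model=256, n_heads=8, n_layers=6):
            super().__init__()
            self.tok = nn.Embedding(VOCAB_SIZE, d_model, padding_idx=PAD)
            self.pos = nn.Embedding(MAX_LEN + 2, d_model)   # room for BOS/EOS
            layer = nn.TransformerEncoderLayer(
                d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True)
            # Self-attention layers with a causal mask act as a decoder-only model.
            self.blocks = nn.TransformerEncoder(layer, n_layers)
            self.head = nn.Linear(d_model, VOCAB_SIZE)

        def forward(self, ids):
            t = ids.size(1)
            causal = nn.Transformer.generate_square_subsequent_mask(t).to(ids.device)
            x = self.tok(ids) + self.pos(torch.arange(t, device=ids.device))
            x = self.blocks(x, mask=causal)
            return self.head(x)    # next-token logits, shape (batch, t, vocab)

Pretraining such a model typically minimizes the cross-entropy between the logits at each position and the next token of the training sequence; conditioning on a target topology could be added, for example, by prepending topology tokens to the input, though the paper's exact conditioning scheme is not described in the abstract.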

Conclusion: TPGen can generate protein sequences that fold into a specified topology, and the pretraining and fine-tuning methods proposed in this study can serve as a framework for designing various types of proteins.
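
As one illustration of how the reinforcement-learning fine-tuning described in the abstract could be wired to a stability predictor, the sketch below applies a REINFORCE-style policy gradient with the predictor's score as the reward. It reuses BOS, MAX_LEN, and ProteinDecoder from the sketch above; the reward definition, the stability_model interface, and the omitted EOS/device handling are assumptions, and the paper's adversarial-learning component is not shown.

    # Reward-guided fine-tuning sketch: sample sequences, score them with a
    # stability predictor, and raise the likelihood of high-reward samples.
    import torch

    def sample_with_logprobs(model, batch_size):
        ids = torch.full((batch_size, 1), BOS, dtype=torch.long)
        logps = []
        for _ in range(MAX_LEN):               # EOS handling omitted for brevity
            logits = model(ids)[:, -1]         # next-token logits
            dist = torch.distributions.Categorical(logits=logits)
            nxt = dist.sample()
            logps.append(dist.log_prob(nxt))   # differentiable log-probability
            ids = torch.cat([ids, nxt.unsqueeze(1)], dim=1)
        return ids, torch.stack(logps, dim=1)

    def finetune_step(generator, stability_model, optimizer, batch_size=32):
        ids, logps = sample_with_logprobs(generator, batch_size)
        with torch.no_grad():
            reward = stability_model(ids)      # assumed: one stability score per sequence
            baseline = reward.mean()           # simple variance-reduction baseline
        loss = -((reward - baseline) * logps.sum(dim=1)).mean()
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()

The baseline subtraction is a standard variance-reduction choice for policy gradients; in practice a KL penalty against the pretrained model is often added so that fine-tuning does not drift too far from the sequence statistics learned during pretraining.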

Keywords: De novo protein design; LSTM; Neural network; Protein topologies; Transformer.

MeSH terms

  • Amino Acid Sequence
  • Amino Acids*
  • Language
  • Learning

Substances

  • Amino Acids
