Prot42: a Novel Family of Protein Language Models for Target-aware Protein Binder Generation

📅 2025-04-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
Designing high-affinity protein binders remains challenging when target 3D structures and binding sites are unknown. To address this, we propose the first purely sequence-driven, target-aware de novo protein binder generation framework. Our method introduces an autoregressive protein language model (pLM) decoder architecture supporting ultra-long sequences of up to 8,192 residues. Trained on large-scale unlabeled protein sequences, it implicitly integrates evolutionary, structural, and functional information—requiring no target structural input. Crucially, it enables target-sequence-conditioned generation, breaking from conventional structure-dependent paradigms. On high-affinity protein binder design and sequence-specific DNA-binding protein generation, our approach significantly outperforms structure-based baselines including AlphaProteo. The code and models are publicly released to facilitate reproducible, accessible protein engineering.
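The core generation loop described above (condition on the target's amino-acid sequence alone, then decode the binder residue by residue with an autoregressive decoder) can be sketched as follows. This is a hypothetical illustration, not Prot42's actual interface: the separator token, the prompt format, and the toy scoring function standing in for the pretrained decoder are all assumptions made for the sketch.

```python
# One-letter codes for the 20 standard amino acids.
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

# Hypothetical separator marking the boundary between the target
# sequence (the condition) and the binder to be generated.
SEP = "|"

def build_prompt(target_seq: str) -> str:
    """Condition generation on the target's sequence only -- no 3D
    structure or binding-site annotation is required."""
    return target_seq + SEP

def toy_next_residue(context: str) -> str:
    """Deterministic stand-in for the decoder's next-token step.
    A real pLM would sample from its learned distribution over
    residues given the full context."""
    return AMINO_ACIDS[len(context) % len(AMINO_ACIDS)]

def generate_binder(target_seq: str, length: int = 10) -> str:
    """Autoregressive decoding: append one residue at a time, each
    conditioned on the target prompt plus everything generated so far."""
    prompt = build_prompt(target_seq)
    binder = ""
    for _ in range(length):
        binder += toy_next_residue(prompt + binder)
    return binder
```

The point of the sketch is the conditioning pattern: because the target enters purely as a sequence prefix, the same loop applies whether or not a crystal structure of the target exists, which is where structure-based tools such as AlphaProteo and RFdiffusion are constrained.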

📝 Abstract
Unlocking the next generation of biotechnology and therapeutic innovation demands overcoming the inherent complexity and resource-intensity of conventional protein engineering methods. Recent GenAI-powered computational techniques often rely on the availability of the target protein's 3D structure and specific binding sites to generate high-affinity binders, constraints exhibited by models such as AlphaProteo and RFdiffusion. In this work, we explore the use of Protein Language Models (pLMs) for high-affinity binder generation. We introduce Prot42, a novel family of pLMs pretrained on vast amounts of unlabeled protein sequences. By capturing deep evolutionary, structural, and functional insights through an advanced auto-regressive, decoder-only architecture inspired by breakthroughs in natural language processing, Prot42 dramatically expands the capabilities of computational protein design from language alone. Remarkably, our models handle sequences up to 8,192 amino acids, significantly surpassing standard limitations and enabling precise modeling of large proteins and complex multi-domain sequences. Demonstrating powerful practical applications, Prot42 excels in generating high-affinity protein binders and sequence-specific DNA-binding proteins. Our models are publicly available, offering the scientific community an efficient and precise computational toolkit for rapid protein engineering.
Problem

Research questions and friction points this paper is trying to address.

Overcoming complexity in protein engineering methods
Generating high-affinity binders without 3D structures
Enabling large-scale protein sequence modeling
Innovation

Methods, ideas, or system contributions that make the work stand out.

Prot42: novel protein language models family
Advanced auto-regressive decoder-only architecture
Handles sequences up to 8,192 amino acids
Mohammad Amaan Sayeed
Inception Institute of Artificial Intelligence, Abu Dhabi, UAE.
Engin Tekin
Cerebras Systems, Sunnyvale, CA, USA.
Maryam Nadeem
Inception Institute of Artificial Intelligence, Abu Dhabi, UAE.
Nancy A. ElNaker
Inception Institute of Artificial Intelligence, Abu Dhabi, UAE.
Aahan Singh
Inception Institute of Artificial Intelligence, Abu Dhabi, UAE.
Natalia Vassilieva
Sr. Director of Product, Cerebras Systems
image analysis, information retrieval, information extraction, machine learning, natural language processing
Boulbaba Ben Amor
Inception Institute of Artificial Intelligence, Abu Dhabi, UAE.