🤖 AI Summary
Addressing the high computational cost of physics-based simulation and the inefficiency of existing deep generative models in protein multi-conformation generation, this work proposes the Structure Language Modeling (SLM) framework. SLM discretizes 3D protein structures into latent-space sequences via a discrete variational autoencoder and employs a conditional language model to jointly capture sequence-conformation distributions. Its core innovation is ESMDiff, a BERT-style structure language model that leverages ESM3 pre-trained representations and incorporates a masked diffusion mechanism, achieving both interpretability and efficient sampling. Evaluated on diverse tasks, including BPTI equilibrium dynamics, conformational change pairs, and intrinsically disordered proteins, ESMDiff generates conformational ensembles 20-100x faster than state-of-the-art methods while maintaining structural fidelity. This substantial acceleration enables large-scale exploration of conformational heterogeneity, thereby advancing conformation-aware drug discovery.
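The first stage of the pipeline, encoding 3D structures into discrete latent tokens, can be illustrated with a toy vector-quantization step. This is a minimal sketch, not the paper's actual discrete VAE: the codebook here is random, the "encoder" is a nearest-centroid lookup, and function names (`tokenize_structure`, `detokenize`) are hypothetical.

```python
import numpy as np

def tokenize_structure(coords, codebook):
    """Map each residue's 3D coordinates to the index of its nearest
    codebook vector (a toy stand-in for the discrete VAE encoder)."""
    # coords: (L, 3); codebook: (K, 3) latent centroids
    dists = np.linalg.norm(coords[:, None, :] - codebook[None, :, :], axis=-1)
    return dists.argmin(axis=1)  # (L,) discrete structure-token sequence

def detokenize(tokens, codebook):
    """Decode tokens back to coordinates by codebook lookup (lossy)."""
    return codebook[tokens]

rng = np.random.default_rng(0)
codebook = rng.normal(size=(16, 3))   # K=16 structure tokens (toy)
coords = rng.normal(size=(8, 3))      # a toy 8-residue backbone
tokens = tokenize_structure(coords, codebook)
recon = detokenize(tokens, codebook)
```

Once structures live in this discrete space, conformation generation reduces to sampling token sequences conditioned on the amino-acid sequence, which is what the language model handles.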
📝 Abstract
Proteins adopt multiple structural conformations to perform their diverse biological functions, and understanding these conformations is crucial for advancing drug discovery. Traditional physics-based simulation methods often struggle with sampling equilibrium conformations and are computationally expensive. Recently, deep generative models have shown promise in generating protein conformations as a more efficient alternative. However, these methods predominantly rely on the diffusion process within a 3D geometric space, which typically centers around the vicinity of metastable states and is often inefficient in terms of runtime. In this paper, we introduce Structure Language Modeling (SLM) as a novel framework for efficient protein conformation generation. Specifically, the protein structures are first encoded into a compact latent space using a discrete variational auto-encoder, followed by conditional language modeling that effectively captures sequence-specific conformation distributions. This enables a more efficient and interpretable exploration of diverse ensemble modes compared to existing methods. Based on this general framework, we instantiate SLM with various popular LM architectures, as well as propose ESMDiff, a novel BERT-like structure language model fine-tuned from ESM3 with masked diffusion. We verify our approach in various scenarios, including the equilibrium dynamics of BPTI, conformational change pairs, and intrinsically disordered proteins. SLM provides a highly efficient solution, offering a 20-100x speedup over existing methods in generating diverse conformations, shedding light on promising avenues for future research.
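The masked-diffusion sampling described above can be sketched as iterative unmasking over an absorbing-state discrete diffusion: start from an all-masked token sequence and, over several steps, commit a fraction of masked positions to the model's predictions. This is a toy illustration under stated assumptions: `toy_denoiser` replaces ESMDiff's learned BERT-style logits with random draws, and the unmasking schedule is the simplest uniform one.

```python
import random

MASK = -1  # absorbing "mask" state in the discrete diffusion

def toy_denoiser(tokens, vocab_size, rng):
    """Stand-in for the BERT-style model: propose a token for every
    masked position (random here; ESMDiff would use learned logits)."""
    return [rng.randrange(vocab_size) if t == MASK else t for t in tokens]

def masked_diffusion_sample(length, vocab_size, steps=4, seed=0):
    """Iteratively unmask: at each step, fix ~1/steps_remaining of the
    still-masked positions to the denoiser's current predictions."""
    rng = random.Random(seed)
    tokens = [MASK] * length
    for steps_remaining in range(steps, 0, -1):
        proposal = toy_denoiser(tokens, vocab_size, rng)
        masked = [i for i, t in enumerate(tokens) if t == MASK]
        k = max(1, len(masked) // steps_remaining)
        for i in rng.sample(masked, k):   # commit a random subset
            tokens[i] = proposal[i]
    return tokens

sample = masked_diffusion_sample(length=8, vocab_size=16)
```

Because each step denoises all masked positions in parallel, the number of model calls is the (small) step count rather than the sequence length, which is the source of the speedup over autoregressive decoding and 3D-space diffusion.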