Predicting Molecular Ground-State Conformation via Conformation Optimization

📅 2024-10-13
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
Predicting molecular ground-state conformations is critical for drug design and other chemical applications; however, traditional energy-optimization methods are computationally expensive, while existing deep learning approaches compromise accuracy and interpretability. To address this, we propose WGFormer—a novel autoencoding architecture that integrates Wasserstein gradient flows with an SE(3)-equivariant Transformer. This work is the first to embed Wasserstein gradient flow theory into an SE(3)-equivariant neural network, enabling the network to implicitly learn physically grounded energy-minimization trajectories. Coupled with latent-space atomic mixture modeling and an MLP-based decoder, WGFormer achieves efficient, high-accuracy, and interpretable conformation optimization. On multiple benchmark datasets, WGFormer significantly outperforms state-of-the-art methods, achieving lower RMSD errors and stronger correlation with quantum-mechanical energies, demonstrating superior physical consistency and generalization capability.

📝 Abstract
Predicting molecular ground-state conformation (i.e., energy-minimized conformation) is crucial for many chemical applications such as molecular docking and property prediction. Classic energy-based simulation is time-consuming when solving this problem, while existing learning-based methods have advantages in computational efficiency but sacrifice accuracy and interpretability. In this work, we propose a novel and effective method to bridge the energy-based simulation and the learning-based strategy, which designs and learns a Wasserstein gradient flow-driven SE(3)-Transformer, called WGFormer, for molecular ground-state conformation prediction. Specifically, our method tackles this task within an auto-encoding framework, which encodes low-quality conformations by the proposed WGFormer and decodes corresponding ground-state conformations by an MLP. The architecture of WGFormer corresponds to Wasserstein gradient flows -- it optimizes molecular conformations by minimizing an energy function defined on the latent mixture models of atoms, thereby significantly improving performance and interpretability. Extensive experiments show that our method consistently outperforms state-of-the-art competitors, providing a new and insightful paradigm to predict molecular ground-state conformation.
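The abstract's core idea -- optimizing a conformation by following a gradient flow that minimizes an energy function -- can be illustrated with a toy, paper-independent example. The sketch below is NOT the paper's model: it uses a hypothetical harmonic bond energy and plain explicit-Euler gradient descent on atom coordinates, simply to show what "conformation optimization as a discretized gradient flow" means; WGFormer's layers play an analogous role, but in a latent space of atom mixtures.

```python
import numpy as np

def bond_energy(x, bonds, r0):
    """Sum of (||x_i - x_j|| - r0)^2 over bonded atom pairs."""
    d = np.linalg.norm(x[bonds[:, 0]] - x[bonds[:, 1]], axis=1)
    return float(np.sum((d - r0) ** 2))

def gradient_flow_step(x, bonds, r0, lr=0.05):
    """One explicit-Euler step of dx/dt = -grad E(x)."""
    grad = np.zeros_like(x)
    for i, j in bonds:
        diff = x[i] - x[j]
        d = np.linalg.norm(diff)
        g = 2.0 * (d - r0) * diff / (d + 1e-12)  # d/dx_i of (d - r0)^2
        grad[i] += g
        grad[j] -= g
    return x - lr * grad

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))                 # a "low-quality" 4-atom conformation
bonds = np.array([[0, 1], [1, 2], [2, 3]])  # a simple chain molecule
r0 = 1.5                                    # hypothetical equilibrium bond length

energies = [bond_energy(x, bonds, r0)]
for _ in range(200):
    x = gradient_flow_step(x, bonds, r0)
    energies.append(bond_energy(x, bonds, r0))

print(round(energies[-1], 6))  # energy after optimization
```

Each iteration strictly decreases the energy (for a small enough step size), which is the defining property of a gradient flow; the paper's contribution is showing that a stack of SE(3)-Transformer layers can be interpreted as such a discretized flow on a Wasserstein space over latent atom distributions.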
Problem

Research questions and friction points this paper is trying to address.

Predict molecular ground-state conformation accurately.
Bridge energy-based and learning-based methods effectively.
Enhance computational efficiency and interpretability simultaneously.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Wasserstein gradient flow-driven SE(3)-Transformer
Auto-encoding framework for conformation prediction
MLP decodes ground-state conformations
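The overall auto-encoding layout from the innovation list above can be sketched as follows. Every shape, layer choice, and function name here is an assumption for illustration, not the paper's actual architecture: the encoder is a crude stand-in for WGFormer that mixes per-atom features with distance-based weights (an SE(3)-invariant quantity), and a two-layer MLP decodes the refined features into 3D coordinates.

```python
import numpy as np

rng = np.random.default_rng(1)

def encoder(coords, feats, n_layers=3):
    """Hypothetical stand-in for WGFormer: iteratively mixes atom
    features using pairwise-distance attention weights."""
    for _ in range(n_layers):
        d = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
        w = np.exp(-d)                      # closer atoms attend more
        w /= w.sum(axis=1, keepdims=True)   # row-normalize the weights
        feats = np.tanh(w @ feats)          # one feature-mixing step
    return feats

def mlp_decoder(feats, W1, W2):
    """Two-layer MLP mapping latent atom features to 3D coordinates."""
    return np.maximum(feats @ W1, 0.0) @ W2  # ReLU hidden layer

n_atoms, dim = 5, 8
coords = rng.normal(size=(n_atoms, 3))   # low-quality input conformation
feats = rng.normal(size=(n_atoms, dim))  # initial per-atom embeddings
W1 = rng.normal(size=(dim, 16)) * 0.1    # untrained weights, illustration only
W2 = rng.normal(size=(16, 3)) * 0.1

pred = mlp_decoder(encoder(coords, feats), W1, W2)
print(pred.shape)  # one predicted 3D position per atom
```

This only captures the pipeline's shape (conformation in, per-atom features refined, coordinates out), not the equivariance machinery or the mixture-model energy that distinguish the actual WGFormer.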
Fanmeng Wang
Renmin University of China
geometric deep learning, graph representation learning, AI4Science
Minjie Cheng
Gaoling School of Artificial Intelligence, Renmin University of China
Hongteng Xu
Gaoling School of Artificial Intelligence, Renmin University of China, Beijing Key Laboratory of Big Data Management and Analysis Methods