MultiPUFFIN: A Multimodal Domain-Constrained Foundation Model for Molecular Property Prediction of Small Molecules

📅 2026-02-28
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the lack of thermodynamic consistency in existing molecular foundation models and the limitations of traditional methods, which are restricted to single-property prediction and rely on small datasets. The authors propose the first multimodal framework that jointly leverages SMILES strings, molecular graphs, and 3D geometric structures, embedding thermodynamic equations as inductive biases to enable consistent multi-task prediction of nine thermophysical properties. The model incorporates a gated cross-modal attention mechanism, domain-constrained prediction heads, and a two-stage training strategy, supporting inference with missing modalities and enabling unsupervised recovery of thermodynamic parameters. Evaluated on a test set of 8,877 molecules, the model achieves an average R² of 0.716, outperforming ChemBERTa-2 across all tasks with 2,000-fold less training data, particularly excelling in temperature-dependent property prediction.
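The gated cross-modal attention described above can be sketched in miniature: tokens from one modality (say, SMILES) attend over tokens from another (say, graph nodes), and a sigmoid gate controls how much of the attended signal is mixed back into the query stream. The sketch below is illustrative only, not the paper's implementation; the scalar gate, single attention head, and function names are all simplifying assumptions.

```python
import math

def _softmax(xs):
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def gated_cross_attention(query_tokens, context_tokens, gate_logit):
    """Each query-modality token attends over the context-modality tokens;
    a sigmoid gate (learned in the real model, a plain scalar here) decides
    how much attended context is added back to the query stream."""
    d = len(query_tokens[0])
    gate = 1.0 / (1.0 + math.exp(-gate_logit))
    fused = []
    for q in query_tokens:
        # Scaled dot-product attention scores against every context token.
        scores = [sum(qi * ci for qi, ci in zip(q, c)) / math.sqrt(d)
                  for c in context_tokens]
        attn = _softmax(scores)
        attended = [sum(a * c[j] for a, c in zip(attn, context_tokens))
                    for j in range(d)]
        # Residual connection modulated by the gate.
        fused.append([qj + gate * aj for qj, aj in zip(q, attended)])
    return fused
```

Because the gate can shut a pathway off entirely, a fusion stack built from such blocks degrades gracefully when a modality is absent, which is one plausible route to the missing-modality inference the summary mentions.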

📝 Abstract
Predicting physicochemical properties across chemical space is vital for chemical engineering, drug discovery, and materials science. Current molecular foundation models lack thermodynamic consistency, while domain-informed approaches are limited to single properties and small datasets. We introduce MultiPUFFIN, a domain-constrained multimodal foundation model addressing both limitations simultaneously. MultiPUFFIN features: (i) an encoder fusing SMILES, graphs, and 3D geometries via gated cross-modal attention, alongside experimental condition and descriptor encoders; (ii) prediction heads embedding established correlations (e.g., Wagner, Andrade, van't Hoff, and Shomate equations) as inductive biases to ensure thermodynamic consistency; and (iii) a two-stage multi-task training strategy.

Extending prior frameworks, MultiPUFFIN predicts nine thermophysical properties simultaneously. It is trained on a multi-source dataset of 37,968 unique molecules (40,904 rows). With roughly 35 million parameters, MultiPUFFIN achieves a mean $R^2 = 0.716$ on a challenging scaffold-split test set of 8,877 molecules. Compared to ChemBERTa-2 (pre-trained on 77 million molecules), MultiPUFFIN outperforms the fine-tuned baseline across all nine properties despite using 2000x fewer training molecules. Advantages are strikingly apparent for temperature-dependent properties, where ChemBERTa-2 lacks the architectural capacity to incorporate thermodynamic conditions.

These results demonstrate that multimodal encoding and domain-informed biases substantially reduce data and compute requirements compared to brute-force pre-training. Furthermore, MultiPUFFIN handles missing modalities and recovers meaningful thermodynamic parameters without explicit supervision. Systematic ablation studies confirm the property-specific benefits of these domain-informed prediction heads.
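As a concrete illustration of an equation-embedded prediction head, one common form of the Wagner correlation maps four molecule-specific coefficients to a vapor-pressure curve. In a model of this kind the network would predict the coefficients (and possibly Tc, Pc) from the molecular embedding, while the fixed functional form supplies the thermodynamic inductive bias. The sketch below just evaluates that correlation; the coefficients shown are placeholders for illustration, not fitted values from the paper.

```python
import math

def wagner_vapor_pressure(T, Tc, Pc, a, b, c, d):
    """Wagner 3-6 form: ln(P/Pc) = (Tc/T) * (a*t + b*t**1.5 + c*t**3 + d*t**6),
    with t = 1 - T/Tc the reduced distance from the critical point.
    T and Tc in kelvin; the result shares Pc's pressure unit."""
    t = 1.0 - T / Tc
    ln_pr = (Tc / T) * (a * t + b * t ** 1.5 + c * t ** 3 + d * t ** 6)
    return Pc * math.exp(ln_pr)

# Placeholder coefficients, roughly water-like in magnitude (not fitted):
p_boil = wagner_vapor_pressure(373.15, 647.096, 22.064e6, -7.7, 1.45, -2.7, -1.4)
```

By construction such a head recovers P = Pc at T = Tc and yields a smooth, monotone vapor-pressure curve below the critical point, which illustrates the kind of built-in consistency the abstract attributes to its correlation-based prediction heads.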
Problem

Research questions and friction points this paper is trying to address.

molecular property prediction
thermodynamic consistency
multimodal foundation model
small molecules
domain-constrained modeling
Innovation

Methods, ideas, or system contributions that make the work stand out.

multimodal fusion
thermodynamic consistency
domain-informed inductive bias
molecular property prediction
cross-modal attention
Idelfonso B. R. Nogueira
Department of Chemical Engineering, Norwegian University of Science and Technology (NTNU), Trondheim 7034, Norway
Carine M. Rebello
Department of Chemical Engineering, Norwegian University of Science and Technology (NTNU), Trondheim 7034, Norway
Mumin Enis Leblebici
Faculty of Industrial Engineering, KU Leuven, Diepenbeek, Belgium
Erick Giovani Sperandio Nascimento
University of Surrey; SENAI CIMATEC University
Artificial Intelligence
Computational Modeling
Computer Science
Environmental Engineering
High