Scalable Boltzmann Generators for equilibrium sampling of large-scale materials

📅 2025-09-29
📈 Citations: 0 · Influential: 0
🤖 AI Summary
Low sampling efficiency, significant bias, and strong sample correlations plague equilibrium sampling in large-scale material systems. To address these challenges, this work introduces a scalable Boltzmann Generator architecture that combines augmented coupling flows with graph neural networks to enable local-environment-aware generative modeling, and employs energy-based training against the physical energy function to ensure thermodynamic consistency. The method transfers across system sizes, substantially accelerating training and boosting sampling throughput, and yields unbiased, low-autocorrelation configurations in a single generation step. Validated on systems including thousand-atom Lennard-Jones crystals, ice phases of mW water, and the phase diagram of silicon, it accurately reconstructs equilibrium ensembles, computes free energies with high precision, and suppresses finite-size effects, thereby overcoming fundamental scalability and efficiency bottlenecks of conventional molecular simulation.

📝 Abstract
The use of generative models to sample equilibrium distributions of many-body systems, as first demonstrated by Boltzmann Generators, has attracted substantial interest due to their ability to produce unbiased and uncorrelated samples in 'one shot'. Despite their promise and impressive results across the natural sciences, scaling these models to large systems remains a major challenge. In this work, we introduce a Boltzmann Generator architecture that addresses this scalability bottleneck with a focus on applications in materials science. We leverage augmented coupling flows in combination with graph neural networks to base the generation process on local environmental information, while allowing for energy-based training and fast inference. Compared to previous architectures, our model trains significantly faster, requires far fewer computational resources, and achieves superior sampling efficiencies. Crucially, the architecture is transferable to larger system sizes, which allows for the efficient sampling of materials with simulation cells of unprecedented size. We demonstrate the potential of our approach by applying it to several materials systems, including Lennard-Jones crystals, ice phases of mW water, and the phase diagram of silicon, for system sizes well above one thousand atoms. The trained Boltzmann Generators produce highly accurate equilibrium ensembles for various crystal structures, as well as Helmholtz and Gibbs free energies across a range of system sizes, able to reach scales where finite-size effects become negligible.
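The augmented coupling flows mentioned in the abstract keep the generator exactly invertible with a cheap log-determinant: physical coordinates are transformed conditioned on a set of augmented variables (and vice versa in alternating layers). A minimal NumPy sketch of one such affine coupling layer is shown below; the conditioner networks `scale_net` and `shift_net` are hypothetical linear stand-ins for the graph neural networks used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
W_s = rng.normal(size=(3, 3)) * 0.1  # toy conditioner weights (placeholders
W_t = rng.normal(size=(3, 3)) * 0.1  # for a GNN over local environments)

def scale_net(a):
    # stand-in conditioner; tanh keeps the log-scales bounded
    return np.tanh(a @ W_s)

def shift_net(a):
    return a @ W_t

def forward(x, a):
    # transform physical coordinates x conditioned on augmented variables a
    s = scale_net(a)
    y = x * np.exp(s) + shift_net(a)
    log_det = s.sum()  # Jacobian of the coupling map is triangular
    return y, a, log_det

def inverse(y, a):
    # exact analytic inverse: undo shift, then scale
    s = scale_net(a)
    return (y - shift_net(a)) * np.exp(-s), a

x, a = rng.normal(size=3), rng.normal(size=3)
y, a_out, log_det = forward(x, a)
x_rec, _ = inverse(y, a_out)
assert np.allclose(x, x_rec)  # coupling layer is exactly invertible
```

In a full flow, several such layers alternate which block (physical or augmented) is transformed, and the per-layer log-determinants accumulate into the exact density needed for energy-based training.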
Problem

Research questions and friction points this paper is trying to address.

Scaling generative models for large-scale materials sampling
Addressing computational bottlenecks in Boltzmann Generator architectures
Achieving efficient equilibrium sampling for thousand-atom systems
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses augmented coupling flows with graph neural networks
Enables energy-based training and fast inference
Transfers to larger systems with reduced resources
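The energy-based training highlighted above can be illustrated with a toy example. A Boltzmann Generator minimizes a reverse-KL-style loss, the mean of u(x) minus the flow's log-determinant over generated samples, and then reweights samples to the exact Boltzmann distribution with importance weights w ∝ exp(-u(x))/q(x). The harmonic potential and scalar "flow" below are illustrative stand-ins, not the paper's Lennard-Jones, mW water, or silicon models.

```python
import numpy as np

rng = np.random.default_rng(1)

def u(x):
    # toy reduced potential (beta*U): harmonic well, exact target is N(0, 1)
    return 0.5 * x**2

# a trivial one-parameter "flow": x = c * z with z ~ N(0, 1),
# so log|det df/dz| = log(c) for every sample
c = 1.5
z = rng.normal(size=10_000)
x = c * z
log_det = np.log(c) * np.ones_like(z)

# reverse-KL (energy-based) training loss, up to an additive constant:
# E_q[ u(x) - log|det df/dz| ]
loss = np.mean(u(x) - log_det)

# log-density of samples under the flow, via change of variables
log_q = -0.5 * z**2 - 0.5 * np.log(2 * np.pi) - log_det

# self-normalized importance weights reweight to the exact Boltzmann ensemble
log_w = -u(x) - log_q
w = np.exp(log_w - log_w.max())
w /= w.sum()

# reweighted estimate of <x^2>; the exact Boltzmann value here is 1.0
x2 = np.sum(w * x**2)
```

Because the weights correct for any mismatch between the flow density q and exp(-u), estimates computed this way are asymptotically unbiased, which is what allows 'one-shot' generation to recover exact equilibrium averages and free energies.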
Maximilian Schebek
Fachbereich Physik, Freie Universität Berlin, 14195 Berlin, Germany
Jutta Rogal
Flatiron Institute
enhanced sampling · dimensionality reduction · machine learning for molecular physics · materials