🤖 AI Summary
In crystal structure prediction (CSP), existing diffusion models fail to simultaneously satisfy equivariance under periodic translations, rotations, and lattice permutations. To address this, we propose EquiCSP, a diffusion generative model that rigorously enforces all three geometric equivariances. Our method resolves the previously overlooked issue of lattice permutation equivariance and introduces a noising algorithm that maintains strict periodic translation equivariance throughout both training and inference. Furthermore, we integrate group-equivariant neural networks into a symmetry-aware diffusion framework to jointly model multiple equivariance types. Experiments demonstrate that our approach significantly outperforms state-of-the-art (SOTA) models in generation accuracy, as measured by structural fidelity and validity, while converging faster during training. This work establishes a paradigm for equivariant generative modeling in crystalline materials science.
📝 Abstract
Symmetry-aware deep learning models, particularly diffusion models, have been extensively studied for Crystal Structure Prediction (CSP), treating it as a conditional generation task. However, ensuring permutation, rotation, and periodic translation equivariance during the diffusion process remains incompletely addressed. In this work, we propose EquiCSP, a novel equivariant diffusion-based generative model. We not only address the overlooked issue of lattice permutation equivariance in existing models, but also develop a unique noising algorithm that rigorously maintains periodic translation equivariance throughout both training and inference. Our experiments indicate that EquiCSP significantly surpasses existing models in generating accurate structures and converges faster during training.
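To make the periodic translation equivariance property concrete, here is a minimal illustrative sketch (not the authors' EquiCSP algorithm): if atoms are represented by fractional coordinates in the unit cell and a noising step wraps coordinates mod 1, then shifting every atom by a constant lattice translation `t` before noising yields the same structure (up to periodicity) as noising first and shifting afterwards. All names below (`noise_frac_coords`, `t`, `eps`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def noise_frac_coords(x, eps):
    """Add a fixed noise sample to fractional coordinates, wrapping mod 1
    so every atom stays inside the unit cell. Illustrative only."""
    return (x + eps) % 1.0

x = rng.random((4, 3))                    # 4 atoms, fractional coords in [0, 1)
eps = 0.1 * rng.standard_normal((4, 3))   # one shared noise draw
t = np.array([0.3, -0.2, 0.7])            # arbitrary periodic translation

# Translate then noise, vs. noise then translate.
shift_then_noise = noise_frac_coords((x + t) % 1.0, eps)
noise_then_shift = (noise_frac_coords(x, eps) + t) % 1.0

# Compare on the torus: the per-coordinate circular distance should be ~0.
d = (shift_then_noise - noise_then_shift) % 1.0
assert np.allclose(np.minimum(d, 1.0 - d), 0.0)
```

The circular-distance comparison at the end accounts for coordinates that land on opposite sides of the wrap boundary (e.g. 0.9999 vs 0.0001), which a plain elementwise comparison would flag as different even though they denote the same periodic position.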