🤖 AI Summary
This work addresses the low sampling efficiency of diffusion models under physical constraints. We propose a physics-aware consistency training framework that combines consistency training with explicit physical priors, such as the conservation laws of a governing partial differential equation (PDE), in a two-stage learning paradigm: the first stage learns the noise-to-data mapping, and the second stage adds a physics-based regularizer so that generated samples satisfy the target PDE. To our knowledge, this is the first method to achieve 100% constraint satisfaction in single-step sampling. On toy PDE benchmarks, it accelerates sampling by over 100× compared to multi-step diffusion baselines, establishing a verifiable, efficient, and physically consistent paradigm for generative PDE solving.
📝 Abstract
We propose a physics-aware Consistency Training (CT) method that accelerates sampling in diffusion models with physical constraints. Our approach uses a two-stage strategy: (1) learning the noise-to-data mapping via CT, and (2) incorporating physics constraints as a regularizer. Experiments on toy examples show that our method generates samples in a single step while adhering to the imposed constraints, suggesting that deep generative models can efficiently solve partial differential equations (PDEs).
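The two-stage objective can be sketched as a sum of a consistency-training term and a PDE-residual penalty. The snippet below is a minimal, hypothetical illustration (not the paper's implementation): the toy "PDE" is u''(x) = 0, the denoiser `f_ideal`, the noise levels, and the weighting are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "data": a discretised solution u(x) of the toy PDE u''(x) = 0 on [0, 1]
# (a straight line), standing in for samples from a PDE-constrained dataset.
x = np.linspace(0.0, 1.0, 64)
u_clean = 0.3 + 0.9 * x

def consistency_loss(f, u_clean, sigmas):
    """Consistency-training term: outputs at two adjacent noise levels of the
    SAME noised sample should agree (the lower-noise output acts as the
    stop-gradient target in actual training)."""
    eps = rng.standard_normal(u_clean.shape)
    u_hi = u_clean + sigmas[1] * eps   # higher noise level
    u_lo = u_clean + sigmas[0] * eps   # lower noise level, same noise draw
    target = f(u_lo, sigmas[0])        # treated as a constant (stop-grad)
    return np.mean((f(u_hi, sigmas[1]) - target) ** 2)

def physics_residual(u, dx):
    """Physics regularizer: mean squared residual of the toy PDE u'' = 0,
    with u'' approximated by central finite differences."""
    u_xx = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / dx**2
    return np.mean(u_xx ** 2)

# An ideal single-step denoiser (maps any noisy input back to the clean
# sample) makes both terms vanish on this toy problem.
f_ideal = lambda u, sigma: u_clean
dx = x[1] - x[0]
lam = 1.0  # illustrative regularization weight
total = consistency_loss(f_ideal, u_clean, sigmas=(0.1, 0.2)) \
        + lam * physics_residual(f_ideal(u_clean, 0.0), dx)
```

In the paper's framing, stage one would minimize the consistency term alone, and stage two would add the residual penalty; here both are simply evaluated at a single point to show their structure.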