🤖 AI Summary
This work proposes a new paradigm for generative diffusion models in infinite-dimensional spaces, circumventing the conventional reliance on time reversal. By rigorously incorporating Doob's h-transform and an exponential change of measure, the method steers a reference diffusion process toward the target distribution without explicitly constructing the reverse dynamics of the noising process. This yields a mathematically rigorous framework for generative modelling within infinite-dimensional stochastic analysis, coupled with score matching for parameter learning. Under verifiable conditions, the authors derive error bounds quantifying the approximation of the target measure and demonstrate the approach's efficacy through experiments on both synthetic and real-world data.
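For orientation, the classical finite-dimensional form of Doob's h-transform reads as follows (a standard textbook formulation, not the paper's infinite-dimensional construction): given a reference diffusion and a space-time harmonic function \(h\), the change of measure

```latex
% Doob's h-transform, finite-dimensional case (standard formulation;
% the paper generalises this construction to infinite dimensions).
\[
  \left.\frac{\mathrm{d}\mathbb{P}^h}{\mathrm{d}\mathbb{P}}\right|_{\mathcal{F}_t}
  = \frac{h(t, X_t)}{h(0, X_0)}
\]
% turns the reference dynamics dX_t = b(X_t) dt + dW_t into a guided
% process with an additional logarithmic-gradient drift:
\[
  \mathrm{d}X_t
  = \bigl(b(X_t) + \nabla_x \log h(t, X_t)\bigr)\,\mathrm{d}t + \mathrm{d}W_t .
\]
```

The extra drift \(\nabla_x \log h\) is what "forces" the reference process toward the target, in place of an explicitly time-reversed noising process.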
📝 Abstract
This paper introduces a rigorous framework for defining generative diffusion models in infinite dimensions via Doob's h-transform. Rather than relying on time reversal of a noising process, a reference diffusion is forced towards the target distribution by an exponential change of measure. Compared to existing methodology, this approach readily generalises to the infinite-dimensional setting, hence offering greater flexibility in the diffusion model. The construction is derived rigorously under verifiable conditions, and bounds with respect to the target measure are established. We show that the forced process under the changed measure can be approximated by minimising a score-matching objective and validate our method on both synthetic and real data.
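To make the score-matching objective concrete, here is a minimal finite-dimensional sketch of denoising score matching on a toy 1-D Gaussian target. All names and the linear model `s_theta(x) = a*x + b` are illustrative assumptions for exposition; the paper's infinite-dimensional objective and function spaces are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the target measure: a 1-D Gaussian N(2, 0.5^2).
x0 = rng.normal(loc=2.0, scale=0.5, size=4096)

# Forward perturbation at a fixed noise level: x_t = x_0 + sigma * eps.
sigma = 1.0
eps = rng.normal(size=x0.shape)
xt = x0 + sigma * eps

# Denoising score matching: the conditional score of x_t given x_0 is
# -(x_t - x_0) / sigma^2, so we regress a model s_theta onto it.
# Here s_theta(x) = a * x + b is a deliberately tiny linear model,
# fitted in closed form by least squares instead of gradient descent.
target = -(xt - x0) / sigma**2
A = np.stack([xt, np.ones_like(xt)], axis=1)
(a, b), *_ = np.linalg.lstsq(A, target, rcond=None)

# For this Gaussian example the true marginal score of x_t is
# -(x - 2) / (0.5^2 + sigma^2), i.e. a ~ -0.8 and b ~ 1.6.
print(a, b)
```

Because both the target and the noise are Gaussian here, the minimiser can be checked against the exact marginal score; in practice `s_theta` would be a neural network trained on the same regression objective.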