🤖 AI Summary
This work addresses the problem of generating samples from a parametrically tilted version of an original distribution using denoising diffusion probabilistic models (DDPMs) under limited data availability. The authors propose a plug-in estimator that, for the first time, establishes minimax optimality for DDPMs in the context of tilted distributions and provides non-asymptotic convergence guarantees in both Wasserstein and total variation distances. Theoretical analysis demonstrates the estimator’s optimality, while extensive simulations confirm its high accuracy in sample generation across varying tilting parameters and sample sizes.
📝 Abstract
Given $n$ independent samples from a $d$-dimensional probability distribution, our aim is to generate diffusion-based samples from a distribution obtained by tilting the original, where the degree of tilt is parametrized by $θ\in \mathbb{R}^d$. We define a plug-in estimator and show that it is minimax-optimal. We develop Wasserstein bounds between the distribution of the plug-in estimator and the true distribution as a function of $n$ and $θ$, illustrating regimes where the output and the desired true distribution are close. Further, under some assumptions, we prove the TV-accuracy of running Diffusion on these tilted samples. Our theoretical results are supported by extensive simulations. Applications of our work include finance, weather and climate modelling, and many other domains, where the aim may be to generate samples from a tilted distribution that satisfies practically motivated moment constraints.