🤖 AI Summary
This paper addresses the challenge of deriving an analytical expression for the score function in nonlinear diffusion generative models. We propose the first theoretical framework grounded in Malliavin calculus, introducing a novel Bismut-type formula that integrates the Skorokhod integral with the first- and second-order variation processes. This yields an exact closed-form expression for the score function applicable to general nonlinear diffusion processes. Crucially, our approach completely eliminates reliance on conventional Malliavin derivatives, substantially enhancing both theoretical tractability and computational feasibility. The derived closed-form solution provides a rigorous foundation for efficient sampling algorithms under complex data distributions and naturally extends to broader classes of nonlinear stochastic differential equation (SDE) modelling scenarios.
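For orientation, the classical Bismut–Elworthy–Li formula, which a Bismut-type formula of this kind generalises, expresses derivatives of diffusion expectations through the first variation process. The sketch below is standard background, not the paper's new result:

```latex
% SDE started at x, with first variation (Jacobian) process
% J_t = \partial X_t^x / \partial x solving the linearised equation:
dX_t^x = b(X_t^x)\,dt + \sigma(X_t^x)\,dW_t, \qquad X_0^x = x .

% Classical Bismut--Elworthy--Li identity (for invertible \sigma):
\nabla_x\, \mathbb{E}\!\left[ f(X_T^x) \right]
  = \mathbb{E}\!\left[ f(X_T^x)\, \frac{1}{T}
      \int_0^T \big( \sigma(X_s^x)^{-1} J_s \big)^{\!\top} \, dW_s \right].

% Letting f concentrate at a point and integrating by parts connects
% identities of this type to the score \nabla_x \log p_T, the object
% the paper expresses in closed form via the variation processes.
```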
📝 Abstract
Score-based diffusion generative models have recently emerged as a powerful tool for modelling complex data distributions. These models aim to learn the score function, which defines a map from a known probability distribution to the target data distribution via deterministic or stochastic differential equations (SDEs). The score function is typically estimated from data using a variety of approximation techniques, such as denoising or sliced score matching, Hyvärinen's method, or Schrödinger bridges. In this paper, we derive an exact, closed-form expression for the score function for a broad class of nonlinear diffusion generative models. Our approach combines modern stochastic analysis tools, such as Malliavin derivatives and their adjoint operators (Skorokhod integrals, or the Malliavin divergence), with a new Bismut-type formula. The resulting expression for the score function can be written entirely in terms of the first and second variation processes, with all Malliavin derivatives systematically eliminated, thereby enhancing its practical applicability. The theoretical framework presented in this work offers a principled foundation for advancing score estimation methods in generative modelling, enabling the design of new sampling algorithms for complex probability distributions. Our results can be extended to broader classes of stochastic differential equations, opening new directions for the development of score-based diffusion generative models.
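For readers new to the setting, the score function referred to above is the gradient of the log marginal density of the forward process. A minimal sketch of the standard score-based SDE formulation (generic background, not the paper's specific nonlinear model):

```latex
% Forward diffusion: data X_0 \sim p_{\mathrm{data}} evolves under an SDE,
% with marginal density p_t at time t:
dX_t = b(X_t, t)\,dt + \sigma(X_t, t)\,dW_t, \qquad X_t \sim p_t .

% The score function is the gradient of the log marginal density:
s(x, t) = \nabla_x \log p_t(x).

% Time-reversed SDE (Anderson, 1982): sampling integrates this backwards
% in time, so an exact expression for s(x,t) -- rather than a learned
% approximation -- is sufficient to generate from p_{\mathrm{data}}:
dX_t = \big[\, b(X_t, t) - \sigma\sigma^{\!\top}(X_t, t)\,
        \nabla_x \log p_t(X_t) \,\big]\, dt
      + \sigma(X_t, t)\, d\bar{W}_t .
```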