🤖 AI Summary
Standard kernel Stein discrepancies (KSDs) control weak convergence but lack theoretical guarantees for moment convergence, limiting their applicability in distribution approximation tasks that require control of higher-order moments.
Method: We propose Diffusion KSD, a novel discrepancy measure that integrates Stein operators, reproducing kernel Hilbert spaces (RKHSs), and diffusion process theory to rigorously characterize *q*-Wasserstein convergence.
Contribution/Results: Diffusion KSD is the first KSD variant known to simultaneously ensure weak convergence and convergence of all moments up to order *q*. We establish sufficient conditions under which a KSD controls moment convergence, thereby extending the convergence guarantees of classical KSDs. The method provides a computationally tractable yet theoretically rigorous tool for diagnosing approximate MCMC samplers and for goodness-of-fit testing with unnormalized models. Its principled grounding in optimal transport and Stein's method enables reliable assessment of distributional approximation quality, with broad implications for Bayesian inference, generative modeling, and variational inference.
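
For orientation, the standard objects behind these claims can be written compactly (the notation below is illustrative and not taken from the paper). The Langevin Stein operator and its induced KSD are

$$(\mathcal{T}_p g)(x) = \langle \nabla \log p(x),\, g(x)\rangle + \nabla \cdot g(x), \qquad \mathrm{KSD}_{k,p}(\mu) = \sup_{\|g\|_{\mathcal{H}_k^d} \le 1} \mathbb{E}_{x \sim \mu}\big[(\mathcal{T}_p g)(x)\big],$$

where $g$ ranges over the unit ball of the RKHS $\mathcal{H}_k^d$. Diffusion KSDs replace $\mathcal{T}_p$ with a diffusion Stein operator $(\mathcal{T}_{p,m}\, g)(x) = \frac{1}{p(x)} \nabla \cdot \big(p(x)\, m(x)\, g(x)\big)$ for a matrix-valued coefficient $m$. Here *q*-Wasserstein convergence is equivalent to weak convergence together with convergence of $q$-th moments:

$$W_q(\mu_n, p) \to 0 \iff \mu_n \Rightarrow p \ \text{ and } \ \mathbb{E}_{\mu_n}\|x\|^q \to \mathbb{E}_{p}\|x\|^q.$$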
📄 Abstract
Kernel Stein discrepancies (KSDs) measure the quality of a distributional approximation and can be computed even when the target density has an intractable normalizing constant. Notable applications include the diagnosis of approximate MCMC samplers and goodness-of-fit tests for unnormalized statistical models. The present work analyzes the convergence control properties of KSDs. We first show that standard KSDs used for weak convergence control fail to control moment convergence. To address this limitation, we next provide sufficient conditions under which alternative diffusion KSDs control both moment and weak convergence. As an immediate consequence we develop, for each $q>0$, the first KSDs known to exactly characterize $q$-Wasserstein convergence.
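
To illustrate why a KSD is computable without the normalizing constant, below is a minimal NumPy sketch of a quadratic-time V-statistic KSD estimate built from the standard Langevin Stein kernel with an inverse multiquadric (IMQ) base kernel, as in Gorham & Mackey (2017). This is not the paper's diffusion KSD; the function names, the estimator, and the Gaussian example are our own illustrative choices. Note that only the score $\nabla \log p$ enters, so any intractable normalizing constant of $p$ cancels out.

```python
import numpy as np

def imq_stein_kernel(x, y, sx, sy, c2=1.0, beta=-0.5):
    """Langevin Stein kernel k_p(x, y) built from the IMQ base kernel
    k(x, y) = (c2 + ||x - y||^2)^beta.  sx, sy are the target scores
    grad log p(x) and grad log p(y); no normalizing constant is needed."""
    d = x.shape[0]
    r = x - y
    d2 = r @ r
    base = c2 + d2
    k = base ** beta                              # k(x, y)
    gx = 2.0 * beta * base ** (beta - 1.0) * r    # grad_x k(x, y)
    gy = -gx                                      # grad_y k(x, y)
    # trace of the cross-derivative matrix grad_x grad_y^T k(x, y)
    tr = -2.0 * beta * (2.0 * (beta - 1.0) * base ** (beta - 2.0) * d2
                        + d * base ** (beta - 1.0))
    return tr + gx @ sy + gy @ sx + k * (sx @ sy)

def ksd(sample, score_fn, c2=1.0, beta=-0.5):
    """Quadratic-time V-statistic estimate of the KSD between the
    empirical distribution of `sample` (n x d) and the target whose
    score function grad log p is `score_fn`."""
    n = sample.shape[0]
    scores = np.array([score_fn(x) for x in sample])
    total = sum(imq_stein_kernel(sample[i], sample[j],
                                 scores[i], scores[j], c2, beta)
                for i in range(n) for j in range(n))
    return np.sqrt(total) / n

# Example: standard Gaussian target, whose score is simply -x.
rng = np.random.default_rng(0)
xs = rng.normal(size=(200, 2))
print(ksd(xs, score_fn=lambda x: -x))  # small for a well-matched sample
```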