Convergence in total variation for the kinetic Langevin algorithm

📅 2024-07-12
🏛️ arXiv.org
📈 Citations: 3
Influential: 0
🤖 AI Summary
This work establishes non-asymptotic total variation convergence bounds for the Kinetic Langevin Monte Carlo (KLMC) algorithm sampling from high-dimensional target distributions. Under the assumptions that the target measure satisfies a Poincaré inequality and that the gradient of the potential is Lipschitz continuous, it derives a dimensionally explicit convergence rate of order $O(\sqrt{d})$, substantially improving upon Dalalyan's (2017) $O(d)$ bound for the (non-kinetic) Langevin Monte Carlo algorithm. Methodologically, the analysis combines probabilistic coupling techniques, a refined use of the Poincaré inequality, and tight control of the discretization error in the underlying stochastic differential equation (SDE). Crucially, the result rigorously quantifies how the kinetic mechanism, i.e., the incorporation of momentum, alleviates the curse of dimensionality: momentum reduces the dimension dependence of the convergence complexity from linear to square-root, providing theoretical justification for momentum-based samplers in high-dimensional settings.
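The KLMC algorithm summarized above discretizes the kinetic (underdamped) Langevin SDE, which augments the position with a momentum variable. Below is a minimal Euler-type sketch in NumPy; the function name `klmc_sample`, the step size `h`, and the friction parameter `gamma` are illustrative assumptions, and the paper's exact discretization scheme may differ.

```python
import numpy as np

def klmc_sample(grad_f, x0, n_steps=1000, h=0.01, gamma=1.0, seed=None):
    """Euler-type sketch of the kinetic Langevin SDE
        dX_t = V_t dt
        dV_t = -(gamma * V_t + grad_f(X_t)) dt + sqrt(2 * gamma) dB_t,
    targeting (in X) the measure with density proportional to exp(-f).
    Generic illustration only, not the paper's exact scheme."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float).copy()
    v = np.zeros_like(x)  # momentum variable introduced by the kinetic mechanism
    for _ in range(n_steps):
        x = x + h * v
        v = (v - h * (gamma * v + grad_f(x))
             + np.sqrt(2.0 * gamma * h) * rng.standard_normal(x.shape))
    return x

# Example: target a standard Gaussian in dimension d = 10, so grad_f(x) = x.
sample = klmc_sample(lambda x: x, np.zeros(10), seed=0)
```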

📝 Abstract
We prove non-asymptotic total variation estimates for the kinetic Langevin algorithm in high dimension when the target measure satisfies a Poincaré inequality and has a gradient-Lipschitz potential. The main point is that the estimate improves significantly upon the corresponding bound, due to Dalalyan, for the non-kinetic version of the algorithm. In particular the dimension dependence drops from $O(n)$ to $O(\sqrt{n})$.
Problem

Research questions and friction points this paper is trying to address.

Proves non-asymptotic total variation estimates for the kinetic Langevin algorithm.
Focuses on high-dimensional target measures satisfying a Poincaré inequality.
Improves the dimension dependence from $O(n)$ to $O(\sqrt{n})$.
Innovation

Methods, ideas, or system contributions that make the work stand out.

The kinetic (momentum-based) Langevin algorithm improves the dimension dependence.
Non-asymptotic total variation estimates in high dimensions.
The target measure satisfies a Poincaré inequality and has a gradient-Lipschitz potential.
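For contrast with the kinetic version, the non-kinetic Langevin Monte Carlo update analyzed by Dalalyan moves the position directly with a gradient step plus Gaussian noise. A minimal sketch, where the function name `lmc_step` and step size `h` are illustrative assumptions:

```python
import numpy as np

def lmc_step(x, grad_f, h, rng):
    # Non-kinetic (overdamped) Langevin update, the O(n) baseline:
    #   x' = x - h * grad_f(x) + sqrt(2h) * xi,  xi ~ N(0, I)
    return x - h * grad_f(x) + np.sqrt(2.0 * h) * rng.standard_normal(x.shape)

# Example: iterate toward a standard Gaussian target, grad_f(x) = x.
rng = np.random.default_rng(0)
x = np.zeros(5)
for _ in range(500):
    x = lmc_step(x, lambda z: z, 0.01, rng)
```

The absence of a momentum variable here is precisely what the paper identifies as the source of the worse, linear-in-dimension rate.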
Joseph Lehec
Université de Poitiers, CNRS, LMA, Poitiers, France