🤖 AI Summary
This work establishes Rosenthal- and Bernstein-type concentration inequalities for additive functionals of geometrically ergodic Markov chains, explicitly characterizing how the deviation bounds depend on the mixing time. Methodologically, it extends the classical Rosenthal inequality to the Markov-dependent setting via a novel analytical framework based on a Poisson equation decomposition, which precisely links mixing constants, martingale Rosenthal constants, and deviation bounds. Combining martingale techniques, geometric ergodicity analysis, and quantitative mixing-time estimation, the approach yields computable, explicit, and tight upper bounds, improving on the polynomial dependence on mixing time present in prior results. The derived inequalities provide a rigorous theoretical foundation for error control in MCMC algorithms, sequential Monte Carlo estimation, and large-sample inference for non-i.i.d. statistics.
📝 Abstract
In this paper, we establish novel deviation bounds for additive functionals of geometrically ergodic Markov chains, analogous to the Rosenthal and Bernstein inequalities for sums of independent random variables. We pay special attention to the dependence of our bounds on the mixing time of the corresponding chain. More precisely, we establish explicit bounds that are linked to the constants from the martingale version of the Rosenthal inequality, as well as to the constants that characterize the mixing properties of the underlying Markov kernel. Finally, our proof technique is, to the best of our knowledge, new and based on a repeated application of the Poisson equation decomposition.
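For context, the classical i.i.d. inequalities that the paper's Markov-chain bounds parallel can be stated as follows. These are the standard textbook forms, not statements taken from the paper itself.

```latex
% Classical Rosenthal inequality: for independent, mean-zero random
% variables X_1, ..., X_n with \mathbb{E}|X_i|^p < \infty and p \ge 2,
% there exists a constant C_p depending only on p such that
\mathbb{E}\Bigl|\sum_{i=1}^{n} X_i\Bigr|^{p}
  \le C_p \Bigl( \sum_{i=1}^{n} \mathbb{E}|X_i|^{p}
  + \Bigl( \sum_{i=1}^{n} \mathbb{E} X_i^{2} \Bigr)^{p/2} \Bigr).

% Classical Bernstein inequality: if additionally |X_i| \le M almost
% surely and \sigma^2 = \sum_{i=1}^{n} \mathbb{E} X_i^2, then for all t \ge 0
\mathbb{P}\Bigl( \Bigl|\sum_{i=1}^{n} X_i\Bigr| \ge t \Bigr)
  \le 2 \exp\Bigl( -\frac{t^{2}}{2\bigl(\sigma^{2} + M t / 3\bigr)} \Bigr).
```

The paper's contribution is to obtain bounds of this shape when the $X_i$ are generated by a geometrically ergodic Markov chain, with the constants made explicit in terms of the chain's mixing properties.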