🤖 AI Summary
This paper addresses non-asymptotic statistical inference for Polyak–Ruppert averaging in linear stochastic approximation (LSA) under Markovian noise. Existing theory lacks sharp normal approximation rates and valid confidence interval construction under dependent noise. To bridge this gap, we propose a novel inference framework based on the multiplier block bootstrap, establishing, for the first time, the non-asymptotic validity of bootstrap confidence intervals under Markovian noise. Theoretically, we derive a Berry–Esseen convergence rate of $O(n^{-1/4})$ in Kolmogorov distance and achieve an asymptotic variance estimation rate of $O(n^{-1/8}\log^{1/2} n)$. These results significantly advance real-time, precise statistical inference for stochastic optimization algorithms operating on dependent data.
📝 Abstract
In this paper we derive non-asymptotic Berry-Esseen bounds for Polyak-Ruppert averaged iterates of the Linear Stochastic Approximation (LSA) algorithm driven by Markovian noise. Our analysis yields $\mathcal{O}(n^{-1/4})$ convergence rates to the Gaussian limit in the Kolmogorov distance. We further establish the non-asymptotic validity of a multiplier block bootstrap procedure for constructing confidence intervals, guaranteeing consistent inference under Markovian sampling. Our work provides the first non-asymptotic guarantees on the rate of convergence of bootstrap-based confidence intervals for stochastic approximation with Markov noise. Moreover, we recover the classical rate of order $\mathcal{O}(n^{-1/8})$, up to logarithmic factors, for estimating the asymptotic variance of the iterates of the LSA algorithm.
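To make the setting concrete, here is a minimal sketch of the two ingredients the abstract names: Polyak-Ruppert averaging of LSA iterates driven by Markovian (here AR(1)) noise, and a multiplier block bootstrap confidence interval. All constants (`A`, `b`, step size, block length) are illustrative, and the centered-multiplier form below is one simple variant of a block bootstrap, not necessarily the exact procedure analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-d LSA instance: solve A * theta = b from observations
# corrupted by AR(1) (hence Markovian) noise. All constants are illustrative.
A, b = 2.0, 1.0                 # target theta* = b / A = 0.5
n, gamma, rho = 20_000, 0.05, 0.5

theta, noise = 0.0, 0.0
iterates = np.empty(n)
for k in range(n):
    noise = rho * noise + rng.standard_normal()   # Markov (AR(1)) noise
    theta -= gamma * (A * theta - (b + noise))    # one LSA step
    iterates[k] = theta

theta_bar = iterates.mean()                       # Polyak-Ruppert average

# Multiplier block bootstrap: split the trajectory into blocks and perturb
# the centered block means with i.i.d. standard normal multiplier weights.
block_len = int(n ** 0.5)
blocks = iterates[: n - n % block_len].reshape(-1, block_len)
centered = blocks.mean(axis=1) - theta_bar

B = 500
boot = np.empty(B)
for i in range(B):
    w = rng.standard_normal(len(centered))        # multiplier weights
    boot[i] = (w * centered).mean()               # bootstrapped fluctuation

lo, hi = np.quantile(boot, [0.025, 0.975])
ci = (theta_bar - hi, theta_bar - lo)             # 95% bootstrap CI for theta*
print(f"theta_bar={theta_bar:.4f}, CI=({ci[0]:.4f}, {ci[1]:.4f})")
```

Blocking is what lets the bootstrap respect the dependence structure: each block mean absorbs the short-range Markovian correlations, so the multiplier weights only need to mimic the fluctuations across approximately independent blocks.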