Smoothed SGD for quantiles: Bahadur representation and Gaussian approximation

📅 2025-05-19
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper addresses the quantile crossing problem—non-monotonicity of estimated quantile curves across quantile levels—in quantile regression. We propose a smoothed stochastic gradient descent (SGD) algorithm that enforces strict monotonicity by smoothing the score function. Theoretically, we establish the first non-asymptotic tail probability bound with explicit convergence rate and a uniform Bahadur representation for quantile estimators; further, we develop a uniform Gaussian approximation with Polyak–Ruppert averaging, breaking free from classical asymptotic frameworks. Methodologically, our approach unifies monotonicity guarantees with non-asymptotic statistical characterization. Numerical experiments demonstrate that the algorithm eliminates quantile crossing in finite samples, yields controllable estimation error, and exhibits robust performance. Altogether, it provides a new paradigm for online quantile learning that is both theoretically rigorous and computationally feasible.

📝 Abstract
This paper considers the estimation of quantiles via a smoothed version of the stochastic gradient descent (SGD) algorithm. By smoothing the score function in the conventional SGD quantile algorithm, we achieve monotonicity in the quantile level, in that the estimated quantile curves do not cross. We derive non-asymptotic tail probability bounds for the smoothed SGD quantile estimate, both with and without Polyak-Ruppert averaging. For the latter, we also provide a uniform Bahadur representation and a resulting Gaussian approximation result. Numerical studies show good finite-sample behavior consistent with our theoretical results.
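The core idea can be sketched in a few lines. Below is a minimal, hedged illustration (not the paper's exact algorithm): the indicator 1{X ≤ q} in the standard SGD quantile update is replaced by a Gaussian CDF with bandwidth `h`, and a running Polyak-Ruppert average is kept. The step-size exponent, bandwidth, and constant `c` are illustrative choices, and the smoothing introduces a small bias controlled by `h`.

```python
from math import erf, sqrt

def smooth_cdf(u):
    # Standard Gaussian CDF, used to smooth the indicator 1{X <= q}
    return 0.5 * (1.0 + erf(u / sqrt(2.0)))

def smoothed_sgd_quantile(samples, tau, h=0.5, c=1.0, averaging=True):
    """Sketch of smoothed SGD quantile estimation (illustrative parameters).

    tau : quantile level in (0, 1)
    h   : smoothing bandwidth (h -> 0 recovers the indicator score)
    c   : step-size constant; gamma_n = c * n**(-0.7), exponent in (1/2, 1)
    """
    q = 0.0       # current iterate
    q_bar = 0.0   # Polyak-Ruppert running average
    for n, x in enumerate(samples, start=1):
        gamma = c / n ** 0.7
        # Smoothed score: tau minus a smooth surrogate for 1{x <= q}
        score = tau - smooth_cdf((q - x) / h)
        q = q + gamma * score
        q_bar += (q - q_bar) / n  # running average of the iterates
    return q_bar if averaging else q
```

Because the smoothed score is monotone in `tau`, estimates computed on the same data at increasing quantile levels do not cross, which is the monotonicity property the paper establishes.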
Problem

Research questions and friction points this paper is trying to address.

Estimating quantiles with a smoothed SGD algorithm
Ensuring non-crossing quantile curves via monotonicity in the quantile level
Deriving non-asymptotic tail bounds and Gaussian approximation results
Innovation

Methods, ideas, or system contributions that make the work stand out.

Smoothed SGD for quantile estimation
Monotonic quantile curves via score smoothing
Bahadur representation and Gaussian approximation
Likai Chen
Department of Mathematics and Statistics, Washington University in St. Louis
Georg Keilbar
Humboldt-Universität zu Berlin
Econometrics · Statistics
Wei Biao Wu
University of Chicago
Statistics