🤖 AI Summary
This paper addresses uniform estimation and inference for nonparametric partitioning-based M-estimators. It establishes a unified theoretical framework accommodating both convex and non-convex objective functions, diverse loss families—including $L_p$, logistic, quantile, and distribution regression losses—and general non-identity (inverse) link functions. Methodologically, it establishes, for the first time, uniformity jointly over the evaluation point of the nonparametric functional parameter and the Euclidean parameter indexing the class of loss functions; derives optimal uniform Bahadur representations with sharp convergence rates; and develops feasible strong approximation techniques coupled with unified inference procedures—integrating partitioning-based estimation, uniform empirical process theory, Bahadur expansions, and extensions to functional transformations. Under weaker regularity conditions than prior work, the approach attains the best known (in some cases optimal) convergence rates and valid uniform inference, substantially advancing the state of the art. The results provide a rigorous, broadly applicable, and computationally implementable statistical foundation for quantile regression, distribution regression, and related nonparametric M-estimation problems.
📝 Abstract
This paper presents uniform estimation and inference theory for a large class of nonparametric partitioning-based M-estimators. The main theoretical results include: (i) uniform consistency for convex and non-convex objective functions; (ii) optimal uniform Bahadur representations; (iii) optimal uniform (and mean-square) convergence rates; (iv) valid strong approximations and feasible uniform inference methods; and (v) extensions to functional transformations of underlying estimators. Uniformity is established over both the evaluation point of the nonparametric functional parameter and a Euclidean parameter indexing the class of loss functions. The results also account explicitly for the smoothness degree of the loss function (if any), and allow for a possibly non-identity (inverse) link function. We illustrate the main theoretical and methodological results with four substantive applications: quantile regression, distribution regression, $L_p$ regression, and logistic regression; many other possibly non-smooth, nonlinear, generalized, robust M-estimation settings are covered by our theoretical results. We provide detailed comparisons with the existing literature and demonstrate substantive improvements: we achieve the best (in some cases optimal) known results under improved (in some cases minimal) requirements in terms of regularity conditions and side rate restrictions. The supplemental appendix reports other technical results that may be of independent interest.
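To fix ideas, a minimal sketch (not the paper's estimator, and far simpler than its general framework) of a partitioning-based M-estimator: piecewise-constant quantile regression on equal-width cells of $[0,1]$. Within each cell, the minimizer of the check loss $\rho_\tau(u) = u(\tau - \mathbf{1}\{u < 0\})$ over a constant fit is the sample $\tau$-quantile of the responses falling in that cell. The function names, bin choice, and simulated data below are illustrative assumptions.

```python
import numpy as np

def partition_quantile_fit(x, y, tau, n_bins):
    """Piecewise-constant tau-quantile fit on equal-width bins of [0, 1].

    Within each cell, the constant minimizing the check loss
    rho_tau(u) = u * (tau - 1{u < 0}) is the sample tau-quantile.
    """
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    # Assign each x to a cell (points at the right endpoint go to the last cell).
    idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_bins - 1)
    fitted = np.full(n_bins, np.nan)
    for b in range(n_bins):
        yb = y[idx == b]
        if yb.size:
            fitted[b] = np.quantile(yb, tau)  # cell-wise M-estimate
    return edges, fitted

def predict(edges, fitted, x_new):
    """Evaluate the piecewise-constant fit at new points."""
    idx = np.clip(np.searchsorted(edges, x_new, side="right") - 1,
                  0, len(fitted) - 1)
    return fitted[idx]

# Illustrative data: the conditional median curve is sin(2*pi*x).
rng = np.random.default_rng(0)
x = rng.uniform(size=2000)
y = np.sin(2 * np.pi * x) + rng.standard_normal(2000)
edges, fitted = partition_quantile_fit(x, y, tau=0.5, n_bins=10)
```

Richer versions of this idea fit a local polynomial (rather than a constant) within each cell and let the number of cells grow with the sample size; the paper's theory covers such estimators uniformly over the evaluation point and over $\tau$, for smooth and non-smooth losses alike.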