🤖 AI Summary
This study investigates the statistical consistency of classical multidimensional scaling (MDS) and its ability to recover a configuration under generalized noise. Specifically, it addresses the setting where pairwise distances are corrupted by noise and the true low-dimensional configuration is unknown. Under the weak assumption that the noise has finite fourth moments, significantly milder than the Gaussian or sub-Gaussian assumptions of prior work, the authors establish, for the first time, the minimax optimality of classical scaling: they derive the convergence rate $O(n^{-1/2})$ together with a matching information-theoretic lower bound. Methodologically, the analysis combines functional analysis, random matrix theory, empirical process techniques, and lower-bound constructions. These results substantially broaden the theoretical applicability of classical MDS, providing rigorous statistical guarantees for its use in high-noise, low signal-to-noise-ratio regimes.
📝 Abstract
We establish the consistency of classical scaling under a broad class of noise models, encompassing many cases commonly studied in the literature. Our approach requires only finite fourth moments of the noise, significantly weakening standard assumptions. We derive convergence rates for classical scaling and establish matching minimax lower bounds, demonstrating that classical scaling recovers the true configuration at the minimax-optimal rate even when the input dissimilarities are corrupted by noise.
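As context for these guarantees, classical scaling itself is a simple spectral procedure: double-center the squared dissimilarity matrix and embed using the top eigenpairs. Below is a minimal numpy sketch; the function name and the heavy-tailed demo noise (Student-t with 5 degrees of freedom, which has finite fourth moments) are illustrative choices, not taken from the paper.

```python
import numpy as np

def classical_scaling(D, d):
    """Classical MDS: recover an n x d configuration from an
    n x n matrix D of (possibly noisy) pairwise distances."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n       # centering matrix
    B = -0.5 * J @ (D ** 2) @ J               # double-centered Gram matrix
    w, V = np.linalg.eigh(B)                  # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:d]             # top-d eigenpairs
    scale = np.sqrt(np.clip(w[idx], 0, None)) # clip tiny negatives caused by noise
    return V[:, idx] * scale                  # embedding, unique up to rotation

# Demo: true 2-D configuration, distances perturbed by heavy-tailed noise
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
D_true = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
noise = 0.01 * rng.standard_t(df=5, size=D_true.shape)  # finite 4th moment
noise = (noise + noise.T) / 2                 # keep the input symmetric
np.fill_diagonal(noise, 0)                    # zero self-distances
X_hat = classical_scaling(D_true + noise, d=2)
```

The output `X_hat` matches the true configuration only up to rotation, reflection, and translation, so recovery is typically checked on the reconstructed pairwise distances rather than the coordinates themselves.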