Locally minimax optimal and dimension-agnostic discrete argmin inference

📅 2025-03-27
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper addresses confidence set inference for the index of the minimum component (argmin) of a high-dimensional mean vector μ. We propose the first locally minimax optimal and dimension-free argmin inference method applicable to arbitrary dimension d, unknown correlation structures, and heavy-tailed distributions—requiring only existence of second moments. Our approach combines sample splitting with self-normalization to construct robust test statistics, integrates a dual-testing framework, and employs robust reweighting to achieve non-asymptotic Type-I error control and optimal convergence rates. We establish, for the first time, the locally minimax separation rate for argmin inference and demonstrate its tight achievability under both equal-mean configurations and strong coordinate-wise dependence. Simulation studies and real-data analyses confirm that our method substantially improves statistical power while rigorously controlling error rates.

Technology Category

Application Category

📝 Abstract
This paper tackles a fundamental inference problem: given $n$ observations from a $d$ dimensional vector with unknown mean $oldsymbol{mu}$, we must form a confidence set for the index (or indices) corresponding to the smallest component of $oldsymbol{mu}$. By duality, we reduce this to testing, for each $r$ in $1,ldots,d$, whether $mu_r$ is the smallest. Based on the sample splitting and self-normalization approach of Kim and Ramdas (2024), we propose"dimension-agnostic"tests that maintain validity regardless of how $d$ scales with $n$, and regardless of arbitrary ties in $oldsymbol{mu}$. Notably, our validity holds under mild moment conditions, requiring little more than finiteness of a second moment, and permitting possibly strong dependence between coordinates. In addition, we establish the local minimax separation rate for this problem, which adapts to the cardinality of a confusion set, and show that the proposed tests attain this rate. Furthermore, we develop robust variants that continue to achieve the same minimax rate under heavy-tailed distributions with only finite second moments. Empirical results on simulated and real data illustrate the strong performance of our approach in terms of type I error control and power compared to existing methods.
Problem

Research questions and friction points this paper is trying to address.

Infer smallest component indices of a high-dimensional mean vector
Develop dimension-agnostic tests valid under arbitrary ties and dependence
Achieve minimax optimality under heavy-tailed distributions with finite moments
Innovation

Methods, ideas, or system contributions that make the work stand out.

Dimension-agnostic tests for discrete argmin inference
Local minimax separation rate adaptation
Robust variants for heavy-tailed distributions
🔎 Similar Papers
No similar papers found.