Precision Matrix Regularization in Sufficient Dimension Reduction for Improved Quadratic Discriminant Classification

📅 2025-06-23
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
In high-dimensional, low-sample-size settings, unstable precision matrix estimation degrades Quadratic Discriminant Analysis (QDA) performance. To address this, we propose the Stabilized Sufficient Dimension Reduction (SSDR) framework. SSDR regularizes class-specific precision matrices within a sufficient dimension reduction paradigm, theoretically guaranteeing preservation of all classification information while accommodating diverse shrinkage estimators. This simultaneously enhances both dimension reduction stability and classification accuracy. Unlike conventional linear dimension reduction methods, SSDR retains interpretability while markedly improving robustness in low-sample-size regimes. Extensive Monte Carlo simulations and experiments on multiple real-world datasets demonstrate that SSDR achieves significantly higher classification accuracy than state-of-the-art dimension reduction methods across most high-dimensional, low-sample-size tasks—particularly when the sample size is substantially smaller than the feature dimension.

Technology Category

Application Category

📝 Abstract
Sufficient dimension reduction (SDR) methods, which often rely on class precision matrices, are widely used in supervised statistical classification problems. However, when class-specific sample sizes are small relative to the original feature-space dimension, precision matrix estimation becomes unstable and, as a result, increases the variability of the linear dimension reduction (LDR) matrix. Ultimately, this fact causes suboptimal supervised classification. To address this problem, we develop a multiclass and distribution-free SDR method, stabilized SDR (SSDR), that employs user-specified precision matrix shrinkage estimators to stabilize the LDR projection matrix and supervised classifier. We establish this technique with the theoretical guarantee of preserving all classification information under the quadratic discriminant analysis (QDA) decision rule. We evaluate multiple precision matrix shrinkage estimators within our proposed SSDR framework through Monte Carlo simulations and applications to real datasets. Our empirical results demonstrate the efficacy of the SSDR method, which generally improves classification accuracy and frequently outperforms several well-established competing SDR methods.
Problem

Research questions and friction points this paper is trying to address.

Stabilizing precision matrix estimation in small sample sizes
Reducing variability in linear dimension reduction matrices
Improving classification accuracy with stabilized SDR method
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses precision matrix shrinkage estimators
Stabilizes LDR projection matrix
Improves quadratic discriminant classification
🔎 Similar Papers
No similar papers found.
D
Derik T. Boonstra
Department of Statistical Science, Baylor University, Waco, TX 76798-7140
Rakheon Kim
Rakheon Kim
Baylor University
D
Dean M. Young
Department of Statistical Science, Baylor University, Waco, TX 76798-7140