High-Dimensional Tensor Discriminant Analysis: Low-Rank Discriminant Structure, Representation Synergy, and Theoretical Guarantees

📅 2025-12-12
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address key challenges in high-dimensional tensor classification—difficulty modeling low-rank discriminative structures, lack of theoretical guarantees, and performance degradation in small-sample regimes—this paper proposes the first semiparametric tensor discriminant analysis framework grounded in CP low-rank structure. Methodologically, it introduces CP decomposition to explicitly capture the multilinear sparsity of discriminative signals; combines randomized composite PCA initialization with an iterative refinement algorithm; and ensures robustness against anisotropic and dependent noise. Theoretically, it establishes global convergence guarantees and derives minimax-optimal misclassification rate bounds. Empirically, on real-world tasks such as graph classification—particularly in high-dimensional, low-sample settings—the method reduces misclassification rates by up to 32% compared to state-of-the-art tensor classifiers and graph neural networks.

📝 Abstract
High-dimensional tensor-valued predictors arise in modern applications, increasingly as learned representations from neural networks. Existing tensor classification methods rely on sparsity or Tucker structures and often lack theoretical guarantees. Motivated by empirical evidence that discriminative signals concentrate along a few multilinear components, we introduce CP low-rank structure for the discriminant tensor, a modeling perspective not previously explored. Under a Tensor Gaussian Mixture Model, we propose high-dimensional CP low-rank Tensor Discriminant Analysis (CP-TDA) with Randomized Composite PCA (rc-PCA) initialization, which is essential for handling dependent and anisotropic noise under weaker signal strength and incoherence conditions, followed by an iterative refinement algorithm. We establish global convergence and minimax-optimal misclassification rates. To handle tensor data deviating from tensor normality, we develop the first semiparametric tensor discriminant model, in which learned tensor representations are mapped via deep generative models into a latent space tailored for CP-TDA. The misclassification risk decomposes into representation, approximation, and estimation errors. Numerical studies and real data analysis on graph classification demonstrate substantial gains over existing tensor classifiers and state-of-the-art graph neural networks, particularly in high-dimensional, small-sample regimes.
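To make the modeling idea concrete, here is a minimal, hypothetical sketch of the setting the abstract describes: a discriminant tensor with CP low-rank structure under a tensor Gaussian mixture, classified by the inner-product rule. This is a toy illustration with isotropic noise and a known discriminant tensor, not the paper's CP-TDA estimator; all dimensions and the rank are made-up values.

```python
import numpy as np

rng = np.random.default_rng(0)
d1, d2, d3, R = 8, 8, 8, 2  # toy tensor dimensions and CP rank (assumed values)

# CP rank-R discriminant tensor: B = sum_r a_r (outer) b_r (outer) c_r.
factors = [rng.standard_normal((d, R)) for d in (d1, d2, d3)]
B = np.einsum('ir,jr,kr->ijk', *factors)

# Tensor Gaussian mixture with identity covariance and class means +/- B/2.
def sample(n, sign):
    return sign * B / 2 + rng.standard_normal((n, d1, d2, d3))

X_pos, X_neg = sample(200, +1), sample(200, -1)

# Under this simplified model the Bayes rule is the sign of <X, B>.
def classify(X):
    return np.sign(np.einsum('nijk,ijk->n', X, B))

acc = (np.mean(classify(X_pos) == 1) + np.mean(classify(X_neg) == -1)) / 2
print(f"accuracy: {acc:.2f}")
```

In practice B is unknown and must be estimated from data; the paper's contribution is doing so consistently in high dimensions via rc-PCA initialization and iterative refinement, with dependent anisotropic noise.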
Problem

Research questions and friction points this paper is trying to address.

How to model low-rank discriminative structure for high-dimensional tensor classification
How to obtain theoretical guarantees—global convergence and optimal misclassification rates—that existing tensor classifiers lack
How to handle tensor data deviating from tensor normality, especially in small-sample regimes
Innovation

Methods, ideas, or system contributions that make the work stand out.

CP low-rank structure for discriminant tensor modeling
Randomized Composite PCA initialization for noise handling
Semiparametric model with deep generative latent mapping
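The initialization idea can be sketched generically: estimate leading subspaces of each mode unfolding with a randomized range sketch, then use them as starting CP factors for iterative refinement. This is a standard randomized-SVD warm start under made-up dimensions, not the paper's rc-PCA, which additionally accounts for dependent and anisotropic noise.

```python
import numpy as np

rng = np.random.default_rng(1)

def randomized_top_singvecs(M, r, oversample=5):
    """Approximate top-r left singular vectors of M via a random range sketch."""
    G = rng.standard_normal((M.shape[1], r + oversample))
    Q, _ = np.linalg.qr(M @ G)                        # orthonormal range basis
    U, _, _ = np.linalg.svd(Q.T @ M, full_matrices=False)
    return Q @ U[:, :r]

def init_cp_factors(T, r):
    """Initialize CP factor matrices from each mode-k unfolding of tensor T."""
    factors = []
    for k in range(T.ndim):
        Mk = np.moveaxis(T, k, 0).reshape(T.shape[k], -1)  # mode-k unfolding
        factors.append(randomized_top_singvecs(Mk, r))
    return factors

T = rng.standard_normal((6, 7, 8))
A, Bf, C = init_cp_factors(T, 2)
print(A.shape, Bf.shape, C.shape)  # (6, 2) (7, 2) (8, 2)
```

A good initialization matters because the CP refinement objective is nonconvex; the paper's theory shows its rc-PCA start lands in a basin where iterative refinement converges globally.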