🤖 AI Summary
To address the curse of dimensionality in high-dimensional tensor regression, this paper proposes the Generalized Tensor Random Projection (GTRP), which embeds high-dimensional tensor-valued covariates into a low-dimensional subspace within a Bayesian inference framework. GTRP integrates tensor random projection with Bayesian model averaging and supports tensor-wise, mode-wise, and hybrid compression. Hierarchical priors and a low-rank parameterisation of the coefficient tensor improve interpretability and estimation stability. Efficient posterior inference on the compressed data is performed via Gibbs sampling, with the normalising constants required for model averaging estimated by reverse logistic regression. Theoretical analysis establishes concentration properties of the random projection and posterior consistency of the estimator. Empirical results show that GTRP substantially reduces computational cost while achieving better out-of-sample predictive accuracy than standard Bayesian tensor regression.
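Reverse logistic regression (Geyer, 1994) estimates log normalising constants from samples of several unnormalised densities by fitting a multinomial "which density did this point come from" classifier whose only free parameters are the log constants. The toy sketch below is not the paper's algorithm: it uses two hypothetical one-dimensional Gaussians with a known true log-ratio, equal sample sizes, and plain gradient ascent, purely to illustrate the mechanism.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy setup: samples from two unnormalised densities
#   q1(x) = exp(-x^2 / 2)              (normalising constant sqrt(2*pi))
#   q2(x) = exp(-x^2 / (2 * sigma^2))  (normalising constant sqrt(2*pi) * sigma)
# so the true log-ratio of normalising constants is log(sigma).
sigma, n = 2.0, 5000
x = np.concatenate([rng.standard_normal(n), sigma * rng.standard_normal(n)])
counts = np.array([n, n])  # samples drawn from each density

# log unnormalised densities evaluated at all pooled points
logq = np.stack([-x**2 / 2, -x**2 / (2 * sigma**2)], axis=1)

# Reverse logistic regression: fit eta = log normalising constants by
# maximising the probability of each point's source label; eta[0] is
# pinned at 0, so only the difference eta[1] - eta[0] is identified.
eta = np.zeros(2)
for _ in range(500):
    logits = logq - eta                    # equal sample sizes: log n_j terms cancel
    logits -= logits.max(axis=1, keepdims=True)
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)      # posterior source probabilities
    grad = p.sum(axis=0) - counts          # gradient of the membership log-likelihood
    eta[1] += grad[1] / (2 * n)            # ascent step on the free parameter

print(eta[1] - eta[0])  # close to log(sigma) = log(2) ≈ 0.693
```

The objective is concave in the free parameter, so the simple ascent loop converges; in practice one would use Newton or quasi-Newton updates and unequal sample-size offsets.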
📝 Abstract
To address the common problem of high dimensionality in tensor regression, we introduce a generalized tensor random projection method that embeds high-dimensional tensor-valued covariates into low-dimensional subspaces with minimal loss of information about the responses. The method is flexible, allowing for tensor-wise, mode-wise, or combined random projections as special cases. A Bayesian inference framework is developed, featuring a hierarchical prior distribution and a low-rank representation of the parameter. Strong theoretical support is provided for the concentration properties of the random projection and the posterior consistency of the Bayesian inference. An efficient Gibbs sampler is constructed to perform inference on the compressed data. To mitigate the sensitivity introduced by random projections, Bayesian model averaging is employed, with normalising constants estimated using reverse logistic regression. An extensive simulation study examines the effects of different tuning parameters. Simulations indicate, and a real data application confirms, that compressed Bayesian tensor regression can achieve better out-of-sample prediction while significantly reducing computational cost compared to standard Bayesian tensor regression.
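The two extreme cases of the projection can be sketched with Gaussian random matrices: a tensor-wise projection compresses the vectorised covariate with a single matrix, while a mode-wise projection compresses each mode separately. The snippet below is a minimal illustration under those assumptions (i.i.d. Gaussian projections scaled by 1/sqrt(k)); the dimensions, scaling, and helper names are illustrative, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(0)

def mode_product(T, M, mode):
    """Multiply tensor T by matrix M along the given mode (mode-n product)."""
    T = np.moveaxis(T, mode, 0)
    shape = T.shape
    out = (M @ T.reshape(shape[0], -1)).reshape((M.shape[0],) + shape[1:])
    return np.moveaxis(out, 0, mode)

def mode_wise_projection(X, ks, rng):
    """Compress each mode of X with an independent Gaussian random matrix."""
    Z = X
    for mode, k in enumerate(ks):
        P = rng.standard_normal((k, Z.shape[mode])) / np.sqrt(k)
        Z = mode_product(Z, P, mode)
    return Z

def tensor_wise_projection(X, k, rng):
    """Vectorise X and compress it with one Gaussian random matrix."""
    x = X.ravel()
    P = rng.standard_normal((k, x.size)) / np.sqrt(k)
    return P @ x

X = rng.standard_normal((20, 30, 40))      # a 3-way tensor covariate
Z_mode = mode_wise_projection(X, (4, 5, 6), rng)
Z_vec = tensor_wise_projection(X, 120, rng)
print(Z_mode.shape)  # (4, 5, 6)
print(Z_vec.shape)   # (120,)
```

Mode-wise compression only ever stores small per-mode matrices (here 4x20, 5x30, 6x40), whereas the tensor-wise matrix is 120x24000; the combined case in the paper mixes the two.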