Compressed Bayesian Tensor Regression

📅 2025-10-02
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the curse of dimensionality in high-dimensional tensor regression, this paper proposes the Generalized Tensor Random Projection (GTRP), which embeds high-dimensional tensor-valued covariates into a low-dimensional subspace within a Bayesian inference framework. GTRP supports tensor-level, mode-level, and hybrid compression, and is combined with Bayesian model averaging to mitigate sensitivity to any single random projection. Hierarchical priors and a low-rank parameterization of the coefficient tensor are introduced to enhance interpretability and estimation stability. Efficient posterior inference on the compressed data is carried out by Gibbs sampling, with the normalizing constants needed for model averaging estimated via reverse logistic regression. Theoretical analysis establishes concentration properties of the projection and posterior consistency of the inference. Empirical results demonstrate that GTRP significantly reduces computational cost while achieving better out-of-sample prediction than standard Bayesian tensor regression.
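To make the compression step concrete, below is a minimal NumPy sketch of tensor-level versus mode-level random projection of a single tensor covariate. The dimensions, the Gaussian projection matrices, and the `mode_product` helper are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# A single 3-way tensor covariate of size p1 x p2 x p3.
p = (20, 15, 10)
X = rng.standard_normal(p)

# Tensor-level compression: one k x (p1*p2*p3) Gaussian map on vec(X).
k = 50
Phi = rng.standard_normal((k, np.prod(p))) / np.sqrt(k)
z_tensor = Phi @ X.reshape(-1)                    # shape (k,)

# Mode-level compression: a separate k_j x p_j Gaussian map per mode.
ks = (5, 4, 3)
Phis = [rng.standard_normal((kj, pj)) / np.sqrt(kj) for kj, pj in zip(ks, p)]

def mode_product(T, M, mode):
    """n-mode product: contract M (k x p_mode) with the given mode of T."""
    T = np.moveaxis(T, mode, 0)                   # bring target mode first
    T = np.tensordot(M, T, axes=(1, 0))           # contract; k lands in front
    return np.moveaxis(T, 0, mode)                # restore the mode order

Z = X
for j, Pj in enumerate(Phis):
    Z = mode_product(Z, Pj, j)
assert Z.shape == ks                              # compressed to 5 x 4 x 3

# Mode-level compression is the tensor-level scheme with a
# Kronecker-structured matrix (row-major vec puts mode 0 first).
Phi_kron = np.kron(np.kron(Phis[0], Phis[1]), Phis[2])
assert np.allclose(Phi_kron @ X.reshape(-1), Z.reshape(-1))
```

The final assertion illustrates why mode-level compression is a special case of the tensor-level scheme: stacking the mode-wise maps yields one Kronecker-structured projection matrix acting on vec(X).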

📝 Abstract
To address the common problem of high dimensionality in tensor regressions, we introduce a generalized tensor random projection method that embeds high-dimensional tensor-valued covariates into low-dimensional subspaces with minimal loss of information about the responses. The method is flexible, allowing for tensor-wise, mode-wise, or combined random projections as special cases. A Bayesian inference framework is provided featuring the use of a hierarchical prior distribution and a low-rank representation of the parameter. Strong theoretical support is provided for the concentration properties of the random projection and posterior consistency of the Bayesian inference. An efficient Gibbs sampler is developed to perform inference on the compressed data. To mitigate the sensitivity introduced by random projections, Bayesian model averaging is employed, with normalising constants estimated using reverse logistic regression. An extensive simulation study is conducted to examine the effects of different tuning parameters. Simulations indicate, and the real data application confirms, that compressed Bayesian tensor regression can achieve better out-of-sample prediction while significantly reducing computational cost compared to standard Bayesian tensor regression.
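As a rough illustration of inference on the compressed data, the sketch below runs a Gibbs sampler for a conjugate Gaussian linear model on projected covariates. It stands in for the paper's sampler: the hierarchical prior and low-rank parameterisation are replaced by a simple normal / inverse-gamma prior, and all dimensions and hyperparameters are made-up values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic compressed data: n responses, k projected covariates.
n, k = 200, 10
Z = rng.standard_normal((n, k))
beta_true = rng.standard_normal(k)
y = Z @ beta_true + 0.5 * rng.standard_normal(n)

# Conjugate prior: beta ~ N(0, tau2 * I), sigma2 ~ Inverse-Gamma(a0, b0).
tau2, a0, b0 = 10.0, 2.0, 2.0
sigma2, draws = 1.0, []

for it in range(2000):
    # Draw beta | sigma2, y from its Gaussian full conditional.
    prec = Z.T @ Z / sigma2 + np.eye(k) / tau2
    cov = np.linalg.inv(prec)
    mean = cov @ (Z.T @ y) / sigma2
    beta = rng.multivariate_normal(mean, cov)
    # Draw sigma2 | beta, y from its inverse-gamma full conditional.
    resid = y - Z @ beta
    sigma2 = 1.0 / rng.gamma(a0 + n / 2, 1.0 / (b0 + resid @ resid / 2))
    if it >= 500:                                 # keep post burn-in draws
        draws.append(beta)

print("posterior mean of beta:", np.round(np.mean(draws, axis=0), 2))
```

Because the sampler runs in the k-dimensional compressed space rather than over all tensor entries, each iteration costs O(k^3) for the covariance solve, which is where the computational savings reported in the abstract come from.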
Problem

Research questions and friction points this paper is trying to address.

Reducing the computational cost of high-dimensional tensor regression
Minimizing information loss through generalized random projections
Improving prediction accuracy with Bayesian model averaging
Innovation

Methods, ideas, or system contributions that make the work stand out.

Generalized tensor random projection reduces dimensionality
Bayesian inference uses a hierarchical prior and a low-rank representation
Gibbs sampling and Bayesian model averaging enable efficient inference on the compressed data (see the sketch below)
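The sketch below is a self-contained toy version of reverse logistic regression for estimating a ratio of normalising constants, the quantity needed to weight models in Bayesian model averaging. It uses two unnormalised Gaussian kernels whose constants are known, so the estimate can be checked; it is a generic illustration, not the paper's implementation.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)

# Two unnormalised kernels q_k(x) = exp(-(x - mu_k)^2 / (2 s_k^2));
# their true constants are c_k = s_k * sqrt(2 * pi).
mus, sigmas = np.array([0.0, 2.0]), np.array([1.0, 0.5])
n = 4000
samples = [rng.normal(m, s, n) for m, s in zip(mus, sigmas)]

x_all = np.concatenate(samples)          # pooled draws from both models
labels = np.repeat([0, 1], n)            # which model produced each draw
log_q = np.stack(
    [-(x_all - m) ** 2 / (2 * s ** 2) for m, s in zip(mus, sigmas)], axis=1
)

def neg_loglik(eta):
    # eta_k stands in for -log c_k; eta_0 is pinned at 0 for identifiability.
    e = np.array([0.0, eta[0]])
    scores = log_q + e + np.log(n)       # log of n_k * q_k(x) * exp(eta_k)
    log_probs = scores - np.logaddexp.reduce(scores, axis=1, keepdims=True)
    return -log_probs[np.arange(2 * n), labels].sum()

res = minimize(neg_loglik, x0=np.zeros(1), method="BFGS")
print("estimated log(c0/c1):", res.x[0])              # ~ log 2
print("true      log(c0/c1):", np.log(sigmas[0] / sigmas[1]))
```

In the model-averaging step, such estimated constants would translate into model weights proportional to the exponentiated log marginal likelihoods; only the constant-ratio estimation is sketched here.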
👥 Authors
Roberto Casarin
Ca’ Foscari University of Venice, Italy
Radu Craiu
Professor of Statistics, University of Toronto
Markov chain Monte Carlo · Copula Modeling · Bayesian Statistics · Copulas · Model Selection
Qing Wang
Ca’ Foscari University of Venice, Italy