Bayesian Adaptive Tucker Decompositions for Tensor Factorization

📅 2024-11-15
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Tucker decomposition requires manual pre-specification of multiranks, compromising the balance between model parsimony and structural expressiveness. Method: We propose a Bayesian adaptive Tucker decomposition model featuring (i) an infinite increasing shrinkage prior for fully automatic, data-driven multirank inference; (ii) a local sparsity prior on the core tensor to jointly capture inter-variable dependencies and intrinsic low-dimensional structure; and (iii) unified handling of continuous/binary data with joint missing-value imputation, implemented via adaptive Gibbs sampling for computational efficiency. Contribution/Results: We establish posterior consistency theoretically. Empirical evaluation on chemometric and complex ecological datasets demonstrates substantial improvements over state-of-the-art methods in imputation accuracy, robustness, and interpretability—without requiring rank tuning.

📝 Abstract
Tucker tensor decomposition offers a more effective representation for multiway data compared to the widely used PARAFAC model. However, its flexibility brings the challenge of selecting the appropriate latent multi-rank. To overcome the issue of pre-selecting the latent multi-rank, we introduce a Bayesian adaptive Tucker decomposition model that infers the multi-rank automatically via an infinite increasing shrinkage prior. The model introduces local sparsity in the core tensor, inducing rich and at the same time parsimonious dependency structures. Posterior inference proceeds via an efficient adaptive Gibbs sampler, supporting both continuous and binary data and allowing for straightforward missing data imputation when dealing with incomplete multiway data. We discuss fundamental properties of the proposed modeling framework, providing theoretical justification. Simulation studies and applications to chemometrics and complex ecological data offer compelling evidence of its advantages over existing tensor factorization methods.
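To make the setup concrete, the sketch below reconstructs a tensor from a Tucker core and per-mode factor matrices via mode-n products. This is a generic illustration of the Tucker model in numpy, not the paper's implementation; the shapes and the `(2, 3, 2)` multirank are arbitrary choices for the example.

```python
import numpy as np

def mode_n_product(tensor, matrix, mode):
    """Multiply a tensor by a matrix along the given mode."""
    return np.moveaxis(np.tensordot(tensor, matrix, axes=(mode, 1)), -1, mode)

def tucker_reconstruct(core, factors):
    """Rebuild a full tensor from a Tucker core and factor matrices U_1, ..., U_D."""
    out = core
    for mode, U in enumerate(factors):
        out = mode_n_product(out, U, mode)
    return out

# Toy example: a 4 x 5 x 6 tensor with multirank (2, 3, 2)
rng = np.random.default_rng(0)
core = rng.standard_normal((2, 3, 2))
factors = [rng.standard_normal((d, r)) for d, r in zip((4, 5, 6), (2, 3, 2))]
X = tucker_reconstruct(core, factors)
print(X.shape)  # (4, 5, 6)
```

The multirank is the tuple of core dimensions, here `(2, 3, 2)`; selecting it in advance is exactly the step the paper's adaptive prior removes.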
Problem

Research questions and friction points this paper is trying to address.

Automatically infer latent multi-rank in Tucker decomposition
Handle both continuous and binary incomplete multiway data
Improve tensor factorization with Bayesian adaptive shrinkage prior
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bayesian adaptive Tucker decomposition model
Infinite increasing shrinkage prior
Efficient adaptive Gibbs sampler
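The paper's exact prior is not spelled out on this page; as an illustrative stand-in, the sketch below draws component-wise prior variances from a multiplicative gamma process (in the spirit of Bhattacharya and Dunson's increasing shrinkage construction), where variances shrink stochastically toward zero as the component index grows, so higher-index components are progressively penalized. The hyperparameter values `a1`, `a2` are arbitrary choices for the demo.

```python
import numpy as np

def increasing_shrinkage_variances(H, a1=2.0, a2=3.0, seed=0):
    """Illustrative multiplicative gamma process: tau_h = prod_{l<=h} delta_l,
    delta_1 ~ Gamma(a1, 1), delta_l ~ Gamma(a2, 1) for l > 1.
    With a2 > 1, the precisions tau_h grow in expectation, so the
    variances 1/tau_h shrink as the component index h increases."""
    rng = np.random.default_rng(seed)
    deltas = np.concatenate([
        rng.gamma(a1, 1.0, size=1),      # first component
        rng.gamma(a2, 1.0, size=H - 1),  # later components drive the shrinkage
    ])
    tau = np.cumprod(deltas)             # cumulative-product precisions
    return 1.0 / tau                     # prior variances, shrinking in h

vars_ = increasing_shrinkage_variances(H=8)
```

In a truncated sampler, one keeps only components whose variance exceeds a small threshold, which is how an adaptive Gibbs sampler can grow or prune the effective multirank during inference.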
Federica Stolf
Department of Statistical Science, Duke University, Durham, NC, USA
Antonio Canale
Associate professor, University of Padova
Bayesian nonparametrics, Functional Data Analysis, Flexible distributions