🤖 AI Summary
High-dimensional tensor-valued time series autoregression faces a trade-off: Tucker decomposition offers strong interpretability but suffers from computational inefficiency, whereas CP decomposition is efficient yet lacks clear statistical interpretability; moreover, strict low-rank assumptions often lead to model misspecification.
Method: We propose a CP-decomposition-driven supervised factor regression framework: first extracting low-dimensional CP features from both responses and covariates, then modeling their mapping via a structured coefficient tensor incorporating low-rank plus sparse components to accommodate heterogeneous signals.
Contribution/Results: Our approach innovatively integrates CP decomposition with supervised factor modeling to yield an interpretable, structured coefficient tensor. We establish non-asymptotic theoretical guarantees for estimation consistency and accuracy. Extensive simulations and an ENSO forecasting application demonstrate that the method achieves significantly improved predictive performance while preserving strong interpretability.
📝 Abstract
In autoregressive modeling for tensor-valued time series, Tucker decomposition, when applied to the coefficient tensor, provides a clear interpretation of supervised factor modeling but loses its efficiency rapidly with increasing tensor order. Conversely, canonical polyadic (CP) decomposition maintains efficiency but lacks a precise statistical interpretation. To attain both interpretability and powerful dimension reduction, this paper proposes a novel approach under the supervised factor modeling paradigm, which first uses CP decomposition to extract response and covariate features separately and then regresses response features on covariate ones. This leads to a new CP-based low-rank structure for the coefficient tensor. Furthermore, to address heterogeneous signals or potential model misspecifications arising from stringent low-rank assumptions, a low-rank plus sparse model is introduced by incorporating an additional sparse coefficient tensor. Nonasymptotic properties are established for the ordinary least squares estimators, and an alternating least squares algorithm is introduced for optimization. Theoretical properties of the proposed methodology are validated by simulation studies, and its enhanced prediction performance and interpretability are demonstrated by the El Niño-Southern Oscillation example.
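To make the alternating least squares idea mentioned in the abstract concrete, below is a minimal sketch of generic CP decomposition fitted by ALS on a 3-way tensor. This is an illustration of the standard CP-ALS technique only, not the paper's estimator: the function names, the random test tensor, and the rank choice are all assumptions for the example.

```python
import numpy as np

def unfold(T, mode):
    # Mode-n matricization: bring `mode` to the front, flatten the rest
    # (C order, so the last remaining axis varies fastest along columns).
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(X, Y):
    # Column-wise Khatri-Rao product: column r is the Kronecker product
    # of X[:, r] and Y[:, r]; row ordering matches `unfold` above.
    rank = X.shape[1]
    return np.einsum('ir,jr->ijr', X, Y).reshape(-1, rank)

def cp_als(T, rank, n_iter=200, seed=0):
    """Fit a rank-`rank` CP model T ≈ sum_r a_r ∘ b_r ∘ c_r by ALS."""
    rng = np.random.default_rng(seed)
    A, B, C = (rng.standard_normal((s, rank)) for s in T.shape)
    for _ in range(n_iter):
        # Each step is an ordinary least squares update of one factor
        # with the other two held fixed.
        A = unfold(T, 0) @ np.linalg.pinv(khatri_rao(B, C).T)
        B = unfold(T, 1) @ np.linalg.pinv(khatri_rao(A, C).T)
        C = unfold(T, 2) @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C

# Illustrative usage: recover a synthetic rank-2 tensor.
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((s, 2)) for s in (4, 5, 6))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cp_als(T, rank=2)
T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
rel_err = np.linalg.norm(T - T_hat) / np.linalg.norm(T)
```

In the paper's setting the CP factors are applied to both the response and covariate sides of the autoregression, and a sparse coefficient tensor is added on top; the sketch above shows only the shared ALS building block.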