🤖 AI Summary
This paper addresses the challenge of state-switching dynamics in latent factors of tensor time series. We propose the Threshold Tensor Factor Model (TTFM), which integrates a threshold autoregressive structure into the CANDECOMP/PARAFAC (CP) decomposition framework, enabling low-rank tensor representations to capture nonlinear regime shifts while preserving interpretability. Methodologically, TTFM jointly estimates factor loadings, latent factor trajectories, and threshold parameters via an iterative optimization procedure, with accompanying statistical inference. We establish asymptotic consistency and convergence rates for all parameter estimators. Simulation studies and empirical analysis on real-world data demonstrate that TTFM significantly outperforms conventional linear tensor factor models in both in-sample fit and out-of-sample rolling forecasting accuracy. By unifying nonlinear dynamics with interpretable tensor decomposition, TTFM offers a novel paradigm for nonlinear dimensionality reduction and dynamic modeling of high-dimensional time series.
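To make the data-generating mechanism concrete, the following is a minimal simulation sketch of the kind of process the summary describes: latent factors follow a two-regime threshold autoregression, and each observation is their rank-R CP combination plus noise. All dimensions, the identity-matrix regime coefficients, and the choice of the lagged first factor as the threshold variable are illustrative assumptions, not the paper's specification; an order-2 tensor (matrix) time series is used for brevity, though the model extends to higher orders via outer products of loading vectors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: X_t is a (p1 x p2) observation (order-2 tensor),
# with rank-R CP structure and T time points.
p1, p2, R, T = 10, 8, 2, 200

# CP loading vectors (one column per rank component), fixed over time
A = rng.standard_normal((p1, R))
B = rng.standard_normal((p2, R))

# Two autoregressive regimes for the latent factor vector f_t in R^R,
# switched by comparing the lagged first factor to a threshold gamma
# (an assumed, simplified threshold rule for illustration).
Phi = {0: 0.5 * np.eye(R), 1: -0.7 * np.eye(R)}
gamma = 0.0  # threshold parameter

f = np.zeros((T, R))
X = np.zeros((T, p1, p2))
for t in range(1, T):
    regime = int(f[t - 1, 0] > gamma)               # regime indicator
    f[t] = Phi[regime] @ f[t - 1] + rng.standard_normal(R)
    # CP reconstruction: X_t = sum_r f_{t,r} * a_r b_r^T + noise
    X[t] = (A * f[t]) @ B.T + 0.1 * rng.standard_normal((p1, p2))
```

Here `(A * f[t]) @ B.T` scales each loading column `a_r` by the current factor value before forming the rank-1 sum, which is exactly the CP combination for a matrix-valued observation. Estimation would then recover `A`, `B`, `gamma`, and the regime dynamics from `X` alone.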
📝 Abstract
This paper proposes a new Threshold Tensor Factor Model in Canonical Polyadic (CP) form for tensor time series. By integrating a threshold autoregressive structure for the latent factor process into the tensor factor model in CP form, the model captures regime-switching dynamics in the latent factor process while retaining the parsimony and interpretability of low-rank tensor representations. We develop estimation procedures for the model and establish the theoretical properties of the resulting estimators. Numerical experiments and a real-data application illustrate the practical performance and usefulness of the proposed framework.