Transfer Learning for High-dimensional Reduced Rank Time Series Models

📅 2025-04-22
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This paper addresses the challenge of vector autoregressive (VAR) modeling for high-dimensional time series under small-sample settings. We propose the first transfer learning framework tailored to low-rank plus sparse VAR models. By incorporating temporally dependent auxiliary data and designing an information-driven weighted sample selection mechanism, our approach mitigates severe parameter estimation bias and unreliable inference in the target domain. Theoretically, we establish— for the first time under this transfer paradigm— asymptotic normality and variable selection consistency, enabling construction of element-wise confidence intervals. Algorithmically, our method achieves both computational efficiency and statistical optimality. Extensive simulations and empirical applications in finance and macroeconomics demonstrate substantial improvements over state-of-the-art methods. The framework provides a novel paradigm for small-sample inference in high-dimensional dynamic systems.

Technology Category

Application Category

📝 Abstract
The objective of transfer learning is to enhance estimation and inference in a target data by leveraging knowledge gained from additional sources. Recent studies have explored transfer learning for independent observations in complex, high-dimensional models assuming sparsity, yet research on time series models remains limited. Our focus is on transfer learning for sequences of observations with temporal dependencies and a more intricate model parameter structure. Specifically, we investigate the vector autoregressive model (VAR), a widely recognized model for time series data, where the transition matrix can be deconstructed into a combination of a sparse matrix and a low-rank one. We propose a new transfer learning algorithm tailored for estimating high-dimensional VAR models characterized by low-rank and sparse structures. Additionally, we present a novel approach for selecting informative observations from auxiliary datasets. Theoretical guarantees are established, encompassing model parameter consistency, informative set selection, and the asymptotic distribution of estimators under mild conditions. The latter facilitates the construction of entry-wise confidence intervals for model parameters. Finally, we demonstrate the empirical efficacy of our methodologies through both simulated and real-world datasets.
Problem

Research questions and friction points this paper is trying to address.

Enhancing time series estimation via transfer learning
Estimating high-dimensional VAR models with sparse structures
Selecting informative observations from auxiliary datasets
Innovation

Methods, ideas, or system contributions that make the work stand out.

Transfer learning for high-dimensional VAR models
Combines sparse and low-rank matrix structures
Novel informative auxiliary data selection
🔎 Similar Papers
No similar papers found.