Toward Temporal Causal Representation Learning with Tensor Decomposition

📅 2025-07-18
📈 Citations: 0
Influential: 0
🤖 AI Summary
For high-dimensional, variable-length, irregularly sampled tensor-form temporal data (e.g., electronic health records), this paper proposes CaRTeD, the first framework to jointly model time-aware causal representation learning and irregular tensor decomposition. Methodologically, CaRTeD couples a provably convergent tensor decomposition module with dynamic causal structure learning and employs a flexible regularization scheme to jointly optimize latent cluster identification, temporal evolution modeling, and causal relationship inference. Theoretically, it establishes the first convergence guarantee for irregular tensor decomposition. Empirically, experiments on MIMIC-III and synthetic datasets show that CaRTeD significantly outperforms state-of-the-art methods in phenotype discovery and causal network recovery, achieving both higher accuracy and better interpretability.

📝 Abstract
Temporal causal representation learning is a powerful tool for uncovering complex patterns in observational data, which are often represented as low-dimensional time series. However, in many real-world applications, data are high-dimensional with varying input lengths and naturally take the form of irregular tensors. To analyze such data, irregular tensor decomposition is critical for extracting meaningful clusters that capture essential information. In this paper, we focus on modeling causal representation learning based on the transformed information. First, we present a novel causal formulation for a set of latent clusters. We then propose CaRTeD, a joint learning framework that integrates temporal causal representation learning with irregular tensor decomposition. Notably, our framework provides a blueprint for downstream tasks using the learned tensor factors, such as modeling latent structures and extracting causal information, and offers a more flexible regularization design to enhance tensor decomposition. Theoretically, we show that our algorithm converges to a stationary point. More importantly, our results fill the gap in theoretical guarantees for the convergence of state-of-the-art irregular tensor decomposition. Experimental results on synthetic and real-world electronic health record (EHR) datasets (MIMIC-III), with extensive benchmarks from both phenotyping and network recovery perspectives, demonstrate that our proposed method outperforms state-of-the-art techniques and enhances the explainability of causal representations.
Problem

Research questions and friction points this paper is trying to address.

Learning temporal causal representations from high-dimensional irregular tensor data
Integrating tensor decomposition with causal learning for latent cluster modeling
Providing theoretical convergence guarantees for irregular tensor decomposition methods
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrates temporal causal learning with tensor decomposition
Proposes flexible regularization for tensor decomposition
Ensures convergence with theoretical guarantees
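The irregular tensor decomposition at the heart of the framework operates on slices X_k whose first dimension varies (e.g., patients with different numbers of hospital visits). As an illustrative sketch only, not the CaRTeD algorithm itself, the classic PARAFAC2-style alternating least squares below shows how such irregular slices can be factored into a shared factor F, common loadings V, and per-slice weights D, with an orthonormal projection U_k absorbing each slice's varying length; all function and variable names here are ours.

```python
import numpy as np

def khatri_rao(A, B):
    """Column-wise Kronecker product: (I*J) x R from A (I x R) and B (J x R)."""
    return np.einsum("ir,jr->ijr", A, B).reshape(-1, A.shape[1])

def parafac2_als(slices, rank, n_iter=200, seed=0):
    """Fit X_k ~= U_k @ F @ diag(D[k]) @ V.T for slices with varying row counts.

    Classic flip-flop ALS: an orthogonal Procrustes step gives each
    orthonormal U_k, then one CP-ALS sweep on the projected R x J slices
    updates F, V, and D.  Illustrative sketch, not the paper's method.
    """
    rng = np.random.default_rng(seed)
    J, K = slices[0].shape[1], len(slices)
    F = np.eye(rank)                      # shared R x R factor
    V = rng.standard_normal((J, rank))    # mode-2 loadings
    D = np.ones((K, rank))                # per-slice diagonal weights

    def procrustes_and_project():
        Us, Y = [], np.empty((K, rank, J))
        for k, X in enumerate(slices):
            # U_k = argmin ||X_k - U G_k|| over orthonormal U (Procrustes).
            P, _, Qt = np.linalg.svd(X @ V @ np.diag(D[k]) @ F.T,
                                     full_matrices=False)
            Us.append(P @ Qt)             # orthonormal I_k x R
            Y[k] = Us[k].T @ X            # project slice down to R x J
        return Us, Y

    for _ in range(n_iter):
        Us, Y = procrustes_and_project()
        # One CP-ALS sweep on the now-regular tensor Y (K x R x J):
        # each unfolding is matched with the Khatri-Rao of the other factors.
        F = Y.transpose(1, 0, 2).reshape(rank, -1) @ khatri_rao(D, V) \
            @ np.linalg.pinv((D.T @ D) * (V.T @ V))
        V = Y.transpose(2, 0, 1).reshape(J, -1) @ khatri_rao(D, F) \
            @ np.linalg.pinv((D.T @ D) * (F.T @ F))
        D = Y.transpose(0, 2, 1).reshape(K, -1) @ khatri_rao(V, F) \
            @ np.linalg.pinv((V.T @ V) * (F.T @ F))
    Us, _ = procrustes_and_project()      # refresh U_k for the final factors
    return Us, F, V, D
```

In EHR phenotyping terms, the columns of V would correspond to candidate phenotypes shared across patients, and D[k] to how strongly each phenotype expresses in patient k; CaRTeD additionally regularizes and couples such factors with a learned causal graph, which this sketch omits.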
Jianhong Chen
Department of Mechanical & Industrial Engineering, Northeastern University, Boston, MA, USA
Meng Zhao
Department of Industrial and Systems Engineering, Lehigh University, Bethlehem, PA, USA
Mostafa Reisi Gahrooei
Department of Industrial and Systems Engineering, University of Florida, Gainesville, FL, USA
Xubo Yue
Assistant Professor, Northeastern University
Causal learning · Gaussian process · Bayesian optimization · Federated learning