🤖 AI Summary
Existing time series benchmarks lack interventional data, hindering the training of causal foundation models. To address this gap, this work proposes CausalTimePrior, a framework built around a synthetic temporal structural causal model (TSCM) prior that generates paired observational and interventional time series. The framework supports configurable causal graphs, nonlinear autoregressive mechanisms, regime-switching dynamics, and diverse intervention types, including hard, soft, and time-varying interventions. Prior-data fitted networks (PFNs) trained on this prior perform in-context estimation of causal effects on unseen TSCMs, pointing toward foundation models for time series causal inference.
📝 Abstract
Prior-data fitted networks (PFNs) have emerged as powerful foundation models for tabular causal inference, yet their extension to time series remains limited by the absence of synthetic data generators that provide interventional targets. Existing time series benchmarks generate observational data with ground-truth causal graphs but lack the interventional data required for training causal foundation models. To address this, we propose **CausalTimePrior**, a principled framework for generating synthetic temporal structural causal models (TSCMs) with paired observational and interventional time series. Our prior supports configurable causal graph structures, nonlinear autoregressive mechanisms, regime-switching dynamics, and multiple intervention types (hard, soft, time-varying). We demonstrate that PFNs trained on CausalTimePrior can perform in-context causal effect estimation on held-out TSCMs, establishing a pathway toward foundation models for time series causal inference.
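To make the idea of paired observational and interventional data concrete, the following sketch simulates a toy two-variable TSCM with a nonlinear autoregressive mechanism and a hard intervention. All names and mechanism choices here are hypothetical illustrations, not the paper's actual prior, which additionally supports configurable graphs, regime switching, and soft or time-varying interventions.

```python
import numpy as np

def simulate_tscm(T=200, intervene_on=None, intervene_value=0.0, seed=0):
    """Toy temporal SCM: X drives Y with lag 1 (hypothetical sketch).

    A hard intervention do(X_t = value) clamps X at every step,
    yielding an interventional trajectory paired with the
    observational one via a shared noise seed.
    """
    rng = np.random.default_rng(seed)
    X = np.zeros(T)
    Y = np.zeros(T)
    for t in range(1, T):
        # Nonlinear autoregressive mechanism for X
        X[t] = 0.7 * np.tanh(X[t - 1]) + 0.1 * rng.standard_normal()
        if intervene_on == "X":
            X[t] = intervene_value  # hard intervention: overwrite the mechanism
        # Y depends on its own past and lagged X
        Y[t] = 0.5 * Y[t - 1] + 0.8 * X[t - 1] + 0.1 * rng.standard_normal()
    return X, Y

# Paired samples from the same TSCM: observational vs. interventional
X_obs, Y_obs = simulate_tscm()
X_int, Y_int = simulate_tscm(intervene_on="X", intervene_value=1.0)
```

Comparing `Y_obs` and `Y_int` gives a ground-truth causal effect target of the kind a PFN could be trained to estimate in context.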