Unify and Anchor: A Context-Aware Transformer for Cross-Domain Time Series Forecasting

📅 2025-03-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
Cross-domain time-series forecasting faces two key challenges: complex temporal pattern heterogeneity and semantic misalignment across domains. To address these, we propose a "unify-and-anchor" transfer paradigm: first, frequency-domain disentanglement yields a unified cross-domain time-series representation that mitigates pattern heterogeneity; second, external contextual cues are introduced as learnable domain anchors to guide adaptive semantic alignment. We design a Temporal Coordinator and a Context-Aware Mixture-of-Experts Transformer architecture, enabling zero-shot cross-domain generalization. Our approach integrates frequency-domain modeling, context-enhanced representation learning, and domain-adaptive feature extraction. Extensive experiments on heterogeneous multi-source benchmarks spanning power, traffic, and healthcare domains demonstrate consistent superiority over state-of-the-art methods, with an average 12.7% improvement in zero-shot transfer performance. The results validate both the strong generalization capability of our framework and the effectiveness of its architectural design.
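The frequency-domain disentanglement step described above can be sketched as follows. This is a minimal illustration under assumptions, not the paper's actual implementation: the `cutoff` hyperparameter and the function name are hypothetical, and a simple real-FFT split stands in for whatever decomposition the Temporal Coordinator actually performs.

```python
import numpy as np

def disentangle_frequencies(series: np.ndarray, cutoff: int = 3):
    """Split a 1-D series into a low-frequency (trend/seasonal) part and a
    high-frequency (residual) part via the real FFT.

    `cutoff` is a hypothetical hyperparameter: the number of lowest
    frequency bins kept in the "unified" low-frequency view.
    """
    spectrum = np.fft.rfft(series)
    low = np.zeros_like(spectrum)
    low[:cutoff] = spectrum[:cutoff]           # keep only slow components
    high = spectrum - low                      # everything else
    low_part = np.fft.irfft(low, n=len(series))
    high_part = np.fft.irfft(high, n=len(series))
    return low_part, high_part

# By linearity of the FFT, the two components sum back to the original signal.
t = np.linspace(0, 1, 128, endpoint=False)
x = np.sin(2 * np.pi * 2 * t) + 0.3 * np.sin(2 * np.pi * 30 * t)
lo, hi = disentangle_frequencies(x, cutoff=5)
assert np.allclose(lo + hi, x)
```

Because the decomposition is exact, downstream modules can model each band separately without losing information, which is one plausible reading of how a "unified perspective" across heterogeneous domains is obtained.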

📝 Abstract
The rise of foundation models has revolutionized natural language processing and computer vision, yet best practices for time series forecasting remain underexplored. Existing time series foundation models often adopt methodologies from these fields without addressing the unique characteristics of time series data. In this paper, we identify two key challenges in cross-domain time series forecasting: the complexity of temporal patterns and semantic misalignment. To tackle these issues, we propose the "Unify and Anchor" transfer paradigm, which disentangles frequency components for a unified perspective and incorporates external context as domain anchors for guided adaptation. Based on this framework, we introduce ContexTST, a Transformer-based model that employs a time series coordinator for structured representation and Transformer blocks with a context-informed mixture-of-experts mechanism for effective cross-domain generalization. Extensive experiments demonstrate that ContexTST advances state-of-the-art forecasting performance while achieving strong zero-shot transferability across diverse domains.
Problem

Research questions and friction points this paper is trying to address.

Addresses cross-domain time series forecasting challenges.
Tackles temporal pattern complexity and semantic misalignment.
Proposes a unified model for effective domain adaptation.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unify and Anchor transfer paradigm
ContexTST Transformer-based model
Context-informed mixture-of-experts mechanism
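The context-informed mixture-of-experts idea can be sketched roughly as below. The expert count, gating form, and the way the domain-context vector enters the router are assumptions for illustration only, not ContexTST's exact design; the point is that the router sees the context alongside the token features, so the same input can be dispatched to different experts depending on its domain anchor.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class ContextMoE:
    """Toy context-informed mixture-of-experts layer (illustrative only).

    Each expert is a linear map; the router computes gating weights from
    the token features concatenated with a domain-context vector, so the
    context "anchors" how inputs are routed across experts.
    """

    def __init__(self, d_model: int, d_ctx: int, n_experts: int):
        self.experts = [rng.standard_normal((d_model, d_model)) * 0.1
                        for _ in range(n_experts)]
        # Router weights act on the concatenation [features ; context].
        self.router = rng.standard_normal((d_model + d_ctx, n_experts)) * 0.1

    def __call__(self, x: np.ndarray, ctx: np.ndarray) -> np.ndarray:
        # x: (tokens, d_model); ctx: (d_ctx,) domain anchor.
        ctx_tiled = np.tile(ctx, (x.shape[0], 1))
        gate = softmax(np.concatenate([x, ctx_tiled], axis=-1) @ self.router)
        # Stack expert outputs: (tokens, d_model, n_experts).
        out = np.stack([x @ W for W in self.experts], axis=-1)
        # Gate-weighted sum over experts.
        return (out * gate[:, None, :]).sum(axis=-1)

moe = ContextMoE(d_model=8, d_ctx=4, n_experts=3)
x = rng.standard_normal((5, 8))
y = moe(x, ctx=np.ones(4))       # same x with a different ctx routes differently
```

Feeding the context into the router rather than the experts keeps the experts domain-agnostic while letting the gating adapt per domain, which matches the "anchor" intuition in the bullet above.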
Xiaobin Hong
Nanjing University
Graph Mining · Time Series Analysis · LLM Reasoning · AI4Science
Jiawen Zhang
The Hong Kong University of Science and Technology
Time Series · Knowledge Graph · AI · HCI
Wenzhong Li
Nanjing University, Nanjing, China
Sanglu Lu
Nanjing University, Nanjing, China
Jia Li
HKUST(GZ), Guangzhou, China