🤖 AI Summary
In contrastive learning for time series, conventional data augmentation often disrupts seasonal patterns and temporal dependencies, distorting the semantics of the original signal. To address this, the paper introduces persistent homology (the first such application in general time series representation learning, per the authors) and proposes a dual-modality joint optimization framework over temporal and topological views. The method explicitly captures the global topological structure of a time series via persistence diagram embeddings and designs a dual-modality contrastive loss that balances augmentation invariance with semantic fidelity. Evaluated across four fundamental tasks (classification, anomaly detection, forecasting, and transfer learning), the approach consistently outperforms prior methods and sets a new state of the art. Empirical results demonstrate substantial improvements in both the structural coherence and the semantic robustness of the learned representations.
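The paper's exact loss is not reproduced here. As a rough illustration of the kind of dual-modality objective the summary describes, below is a minimal symmetric InfoNCE sketch in NumPy, where row `i` of the temporal embeddings and row `i` of the topological embeddings come from the same series and form the positive pair. The function name, the temperature value, and the symmetric form are assumptions for illustration, not TopoCL's actual formulation:

```python
import numpy as np

def cross_modal_info_nce(z_time, z_topo, temperature=0.1):
    """Symmetric InfoNCE between temporal and topological embeddings.

    z_time, z_topo: (batch, dim) arrays; the (i, i) pairs are positives,
    all other pairs in the batch serve as negatives.
    """
    def normalize(z):
        return z / np.linalg.norm(z, axis=1, keepdims=True)

    zt, zp = normalize(z_time), normalize(z_topo)
    logits = (zt @ zp.T) / temperature        # (batch, batch) cosine similarities

    def nce(l):
        # cross-entropy with the diagonal entry as the target class
        l = l - l.max(axis=1, keepdims=True)  # numerical stability
        log_prob = l - np.log(np.exp(l).sum(axis=1, keepdims=True))
        return -np.mean(np.diag(log_prob))

    # average over both retrieval directions: time->topology and topology->time
    return 0.5 * (nce(logits) + nce(logits.T))
```

Aligned embeddings (matching rows similar) drive this loss toward zero, while misaligned pairings inflate it, which is the mechanism by which the time-topology correspondence term rewards representations that agree across the two modalities.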
📝 Abstract
Universal time series representation learning is challenging but valuable in real-world applications such as classification, anomaly detection, and forecasting. Recently, contrastive learning (CL) has been actively explored for time series representation learning. However, a key challenge is that the data augmentation process in CL can distort seasonal patterns or temporal dependencies, inevitably leading to a loss of semantic information. To address this challenge, we propose Topological Contrastive Learning for time series (TopoCL). TopoCL mitigates such information loss by incorporating persistent homology, which captures topological characteristics of data that remain invariant under transformations. In this paper, we treat the temporal and topological properties of time series data as distinct modalities. Specifically, we compute persistent homology to construct topological features of time series data, representing them as persistence diagrams. We then design a neural network to encode these persistence diagrams. Our approach jointly optimizes CL within the time modality and time-topology correspondence, promoting a comprehensive understanding of both the temporal semantics and the topological properties of time series. We conduct extensive experiments on four downstream tasks: classification, anomaly detection, forecasting, and transfer learning. The results demonstrate that TopoCL achieves state-of-the-art performance.
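For readers unfamiliar with persistence diagrams: one common, simple way to obtain a diagram from a univariate series is 0-dimensional persistent homology of the sublevel-set filtration, where each local minimum "births" a connected component that later "dies" when it merges with an older one (the elder rule). The plain-Python sketch below illustrates that computation with a union-find; the function name and the choice of filtration are illustrative assumptions, and the paper's actual construction may differ:

```python
def sublevel_persistence(values):
    """0-dimensional persistence pairs (birth, death) of the sublevel-set
    filtration of a 1-D signal, computed via union-find with the elder rule."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    parent = {}   # union-find over indices activated so far
    birth = {}    # root index -> filtration value at which its component was born

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    pairs = []
    for i in order:                        # sweep samples in increasing value
        parent[i], birth[i] = i, values[i]
        for j in (i - 1, i + 1):           # try to merge with active neighbors
            if j in parent:
                ri, rj = find(i), find(j)
                if ri == rj:
                    continue
                # elder rule: the component with the lower birth survives
                old, young = (ri, rj) if birth[ri] <= birth[rj] else (rj, ri)
                if values[i] > birth[young]:
                    pairs.append((birth[young], values[i]))
                parent[young] = old
    # the component of the global minimum never dies
    for r in {find(i) for i in parent}:
        pairs.append((birth[r], float("inf")))
    return sorted(pairs)

# Example: the local minimum at value 1.0 dies at the local maximum 2.0,
# while the global minimum 0.0 persists forever.
diagram = sublevel_persistence([0.0, 2.0, 1.0, 3.0])
# → [(0.0, inf), (1.0, 2.0)]
```

The resulting multiset of (birth, death) points is the persistence diagram; in TopoCL such diagrams are then embedded by a neural network so the topological modality can enter the contrastive objective.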