A Unified Contrastive-Generative Framework for Time Series Classification

📅 2025-08-12
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
In multivariate time series classification, contrastive learning suffers from high intra-class similarity, while generative methods rely heavily on large-scale data. To address these issues, this paper proposes CoGenT, the first end-to-end framework that unifies contrastive and generative objectives. CoGenT co-optimizes the SimCLR contrastive loss and the masked autoencoder (MAE) reconstruction loss within a single model, jointly strengthening instance discrimination and data-distribution modeling. Experiments on six benchmark datasets show that CoGenT significantly outperforms single-paradigm baselines, improving the F1 score by up to 59.2% over SimCLR and 14.27% over MAE. The framework combines strong discriminative capability with generative robustness, mitigating the challenges posed by few-shot settings and intra-class confusion.
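The joint objective described above can be sketched as a weighted sum of the two standard losses: SimCLR's NT-Xent contrastive term plus an MAE-style masked reconstruction term. The function names and the weighting parameter `lam` below are illustrative assumptions for a minimal NumPy sketch, not the paper's released code.

```python
import numpy as np

def nt_xent(z1, z2, tau=0.5):
    """SimCLR NT-Xent loss over a batch of N paired embeddings (2N views total)."""
    z = np.concatenate([z1, z2], axis=0).astype(float)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # cosine similarities
    sim = z @ z.T / tau
    n = z1.shape[0]
    # The positive partner of view i is view i+n (and vice versa).
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(0, n)])
    np.fill_diagonal(sim, -np.inf)                     # exclude self-similarity
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()

def masked_mse(x, x_hat, mask):
    """MAE-style reconstruction loss, computed only on masked positions."""
    return ((x_hat - x)[mask] ** 2).mean()

def cogent_loss(z1, z2, x, x_hat, mask, lam=1.0, tau=0.5):
    """Joint contrastive-generative objective (lam is a hypothetical weight)."""
    return nt_xent(z1, z2, tau) + lam * masked_mse(x, x_hat, mask)
```

In a training loop, `z1`/`z2` would be encoder embeddings of two augmented views of each series and `x_hat` the decoder's reconstruction of the masked input; back-propagating through the single scalar `cogent_loss` is what couples the two paradigms in one model.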

πŸ“ Abstract
Self-supervised learning (SSL) for multivariate time series mainly includes two paradigms: contrastive methods that excel at instance discrimination and generative approaches that model data distributions. While effective individually, their complementary potential remains unexplored. We propose a Contrastive Generative Time series framework (CoGenT), the first framework to unify these paradigms through joint contrastive-generative optimization. CoGenT addresses fundamental limitations of both approaches: it overcomes contrastive learning's sensitivity to high intra-class similarity in temporal data while reducing generative methods' dependence on large datasets. We evaluate CoGenT on six diverse time series datasets. The results show consistent improvements, with up to 59.2% and 14.27% F1 gains over standalone SimCLR and MAE, respectively. Our analysis reveals that the hybrid objective preserves discriminative power while acquiring generative robustness. These findings establish a foundation for hybrid SSL in temporal domains. We will release the code shortly.
Problem

Research questions and friction points this paper is trying to address.

No unified contrastive-generative SSL framework exists for time series classification
High intra-class similarity in temporal data undermines contrastive learning
Generative methods depend heavily on large-scale datasets
Innovation

Methods, ideas, or system contributions that make the work stand out.

Unifies contrastive and generative learning paradigms
Overcomes sensitivity to high intra-class similarity
Reduces dependence on large datasets