TS-ACL: Closed-Form Solution for Time Series-oriented Continual Learning

๐Ÿ“… 2024-10-21
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
Time-series class-incremental learning (TSCIL) confronts two core challenges: catastrophic forgetting and intra-class variability; yet existing sample-free methods struggle to address both simultaneously. This paper introduces the first closed-form continual learning framework tailored for time-series data: it abandons gradient-based parameter updates in favor of analytical solutions for stable knowledge consolidation; explicitly models intra-class variability via global distribution estimation to capture subject-specific temporal patterns; and operates entirely without storing historical samples, thereby preserving data privacy and enhancing computational efficiency. Evaluated on five mainstream multimodal time-series benchmarks, the method achieves performance close to joint training, attains state-of-the-art results on four key metrics, and substantially raises the performance ceiling for TSCIL.
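The "closed-form, gradient-free" idea described above can be stated generically as recursive ridge regression over accumulated feature statistics (a standard analytic-learning formulation; the paper's exact recursion and feature expansion may differ):

```latex
W_t = \left(G_t + \lambda I\right)^{-1} C_t,
\qquad
G_t = \sum_{s \le t} X_s^\top X_s,
\qquad
C_t = \sum_{s \le t} X_s^\top Y_s
```

Because $G_t$ and $C_t$ are running sums of per-task statistics, the task-$t$ weights $W_t$ equal the ridge solution obtained by training jointly on all tasks seen so far: no gradient steps, no stored samples, and therefore no gradient-induced forgetting.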

๐Ÿ“ Abstract
Time series classification underpins critical applications such as healthcare diagnostics and gesture-driven interactive systems in multimedia scenarios. However, time series class-incremental learning (TSCIL) faces two major challenges: catastrophic forgetting and intra-class variations. Catastrophic forgetting occurs because gradient-based parameter update strategies inevitably erase past knowledge. Unlike images, time series data exhibit subject-specific patterns, known as intra-class variations: differences in the patterns observed within the same class. While exemplar-based methods fail to cover these diverse variations with limited samples, existing exemplar-free methods lack explicit mechanisms to handle them. To address these two challenges, we propose TS-ACL, which leverages a gradient-free closed-form solution to avoid the catastrophic forgetting inherent in gradient-based optimization while simultaneously learning global distributions to resolve intra-class variations. Additionally, it provides privacy protection and efficiency. Extensive experiments on five benchmark datasets covering various sensor modalities and tasks demonstrate that TS-ACL achieves performance close to joint training on four datasets, outperforming existing methods and establishing a new state-of-the-art (SOTA) for TSCIL.
Problem

Research questions and friction points this paper is trying to address.

Addresses catastrophic forgetting in time series class-incremental learning
Resolves intra-class variations in time series data patterns
Provides privacy protection and efficiency in continual learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Closed-form solution avoids catastrophic forgetting
Learns global distributions for intra-class variations
Provides privacy protection and efficiency
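The contributions above can be sketched in code. Below is a minimal, generic analytic class-incremental classifier over frozen-backbone features, not TS-ACL's exact algorithm: the class name `AnalyticClassifier` and method names are illustrative, and the paper's global-distribution modeling is omitted. It shows why a closed-form update avoids forgetting and needs no stored samples: only aggregated statistics persist across tasks.

```python
import numpy as np

class AnalyticClassifier:
    """Gradient-free class-incremental linear classifier (illustrative sketch).

    Maintains the feature autocorrelation matrix G = sum(x x^T) and the
    feature-label cross-correlation C = sum(x y^T). Each task updates the
    weights analytically via ridge regression: W = (G + reg*I)^{-1} C.
    No raw samples are stored, only these aggregated statistics, so the
    result matches joint training on all tasks seen so far.
    """

    def __init__(self, feat_dim, reg=1e-3):
        self.G = np.zeros((feat_dim, feat_dim))  # running sum of x x^T
        self.C = np.zeros((feat_dim, 0))         # running sum of x y^T; widens with new classes
        self.reg = reg
        self.W = None

    def fit_task(self, X, Y):
        """X: (n, d) features from a frozen backbone; Y: (n, k_total) one-hot labels."""
        # Widen the cross-correlation matrix when a task introduces new classes.
        k_new = Y.shape[1] - self.C.shape[1]
        if k_new > 0:
            self.C = np.hstack([self.C, np.zeros((self.C.shape[0], k_new))])
        self.G += X.T @ X
        self.C += X.T @ Y
        # Closed-form ridge solution over all tasks seen so far.
        d = self.G.shape[0]
        self.W = np.linalg.solve(self.G + self.reg * np.eye(d), self.C)

    def predict(self, X):
        return np.argmax(X @ self.W, axis=1)
```

After fitting a second task, predictions on first-task features are unchanged relative to joint training, since `G` and `C` summarize every sample ever seen.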
๐Ÿ”Ž Similar Papers
No similar papers found.