🤖 AI Summary
Time-series class-incremental learning (TSCIL) confronts two core challenges, catastrophic forgetting and intra-class variability, yet existing exemplar-free methods struggle to address both simultaneously. This paper introduces the first closed-form continual learning framework tailored to time-series data: it abandons gradient-based parameter updates in favor of analytical solutions for stable knowledge consolidation; it explicitly models intra-class variability via global distribution estimation to capture subject-specific temporal patterns; and it operates entirely without storing historical samples, thereby preserving data privacy and improving computational efficiency. Evaluated on five mainstream multimodal time-series benchmarks, the method achieves performance close to joint training, attains state-of-the-art results on four key metrics, and substantially raises the performance ceiling for TSCIL.
📝 Abstract
Time series classification underpins critical applications such as healthcare diagnostics and gesture-driven interactive systems in multimedia scenarios. However, time series class-incremental learning (TSCIL) faces two major challenges: catastrophic forgetting and intra-class variations. Catastrophic forgetting arises because gradient-based parameter updates inevitably overwrite past knowledge. Unlike images, time series data exhibit subject-specific patterns, known as intra-class variations: differences in the patterns observed within the same class. Exemplar-based methods cannot cover these diverse variations with a limited number of stored samples, while existing exemplar-free methods lack explicit mechanisms to handle them. To address both challenges, we propose TS-ACL, which leverages a gradient-free closed-form solution to avoid the catastrophic forgetting inherent in gradient-based optimization while simultaneously learning global distributions to resolve intra-class variations. It also offers privacy protection and computational efficiency. Extensive experiments on five benchmark datasets covering diverse sensor modalities and tasks demonstrate that TS-ACL achieves performance close to joint training on four datasets, outperforming existing methods and establishing a new state-of-the-art (SOTA) for TSCIL.
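To make the "gradient-free closed-form solution" idea concrete, the sketch below shows the general analytic-learning principle behind such methods: a ridge-regression classifier whose sufficient statistics are accumulated task by task, so re-solving after each task reproduces the joint-training solution exactly and nothing is forgotten. This is a minimal illustration of the technique family, not the paper's actual TS-ACL algorithm; the features, labels, dimensions, and regularizer here are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
d, c, lam = 16, 4, 1.0  # feature dim, total classes, ridge regularizer (assumed values)

# Running sufficient statistics. Updating them per task and re-solving the
# closed-form system yields exactly the joint-training weights: no gradient
# steps, hence no gradient-driven forgetting.
G = lam * np.eye(d)      # accumulates X^T X (plus the ridge term)
b = np.zeros((d, c))     # accumulates X^T Y

def learn_task(X, Y):
    """Absorb one task's data into the statistics and return fresh weights."""
    global G, b
    G += X.T @ X
    b += X.T @ Y
    return np.linalg.solve(G, b)  # W = (lam*I + X^T X)^-1 X^T Y

# Two incremental "tasks" with disjoint synthetic class labels.
X1, y1 = rng.normal(size=(100, d)), rng.integers(0, 2, size=100)
X2, y2 = rng.normal(size=(100, d)), rng.integers(2, 4, size=100)
W_inc = learn_task(X1, np.eye(c)[y1])
W_inc = learn_task(X2, np.eye(c)[y2])

# Solving jointly on all data at once gives the same weights.
X_all = np.vstack([X1, X2])
Y_all = np.eye(c)[np.concatenate([y1, y2])]
W_joint = np.linalg.solve(lam * np.eye(d) + X_all.T @ X_all, X_all.T @ Y_all)
assert np.allclose(W_inc, W_joint)
```

The equivalence holds because the accumulated `G` and `b` after both tasks equal the statistics computed on the pooled data, which is why analytic methods can match joint training without revisiting old samples.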