Prototypical Exemplar Condensation for Memory-efficient Online Continual Learning

📅 2026-03-14
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the high memory overhead and privacy concerns of conventional replay-based methods in online continual learning, which typically require storing large volumes of historical data. To mitigate catastrophic forgetting under stringent memory constraints, the authors propose a prototype-based compressed replay strategy that synthesizes a small set of representative prototype samples per class and augments them via a perturbation mechanism to generate diverse synthetic variants. By integrating prototype synthesis, feature extraction, and perturbation-based augmentation, the method drastically reduces storage requirements while preserving data privacy. Extensive experiments on multiple benchmarks and large-scale multitask settings demonstrate that the approach consistently outperforms existing replay techniques—even when retaining only a minimal number of samples per class—thereby achieving superior performance with significantly lower memory consumption.
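For intuition, here is a minimal sketch of the condensation idea described above, assuming a PyTorch setup: a handful of per-class exemplars is optimized so that their features, under a frozen feature extractor, land on the class prototype (the mean feature of that class's real samples). The function and parameter names (condense_exemplars, spc, steps) are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def condense_exemplars(feature_extractor, class_images, spc=2, steps=200, lr=0.1):
    """Synthesize `spc` exemplars whose features approximate the class prototype.

    feature_extractor: frozen backbone mapping images -> feature vectors
    class_images:      tensor [N, C, H, W] of real samples from one class
    """
    feature_extractor.eval()
    for p in feature_extractor.parameters():
        p.requires_grad_(False)  # keep the backbone fixed during condensation

    with torch.no_grad():
        prototype = feature_extractor(class_images).mean(dim=0)  # class-mean feature

    # Initialise the synthetic exemplars from a few random real samples of the class.
    idx = torch.randperm(class_images.size(0))[:spc]
    exemplars = class_images[idx].clone().requires_grad_(True)
    opt = torch.optim.SGD([exemplars], lr=lr)

    for _ in range(steps):
        opt.zero_grad()
        feats = feature_extractor(exemplars)
        # Pull the mean feature of the exemplars toward the class prototype.
        loss = F.mse_loss(feats.mean(dim=0), prototype)
        loss.backward()
        opt.step()

    return exemplars.detach()  # stored in memory instead of many raw samples
```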

📝 Abstract
Rehearsal-based continual learning (CL) mitigates catastrophic forgetting by maintaining a subset of samples from previous tasks for replay. Existing studies primarily focus on optimizing memory storage through coreset selection strategies. While these methods are effective, they typically require storing a substantial number of samples per class (SPC), often exceeding 20, to maintain satisfactory performance. In this work, we propose to further compress the memory footprint by synthesizing and storing prototypical exemplars, which can form representative prototypes when passed through the feature extractor. Owing to their representative nature, these exemplars enable the model to retain previous knowledge using only a small number of samples while preserving privacy. Moreover, we introduce a perturbation-based augmentation mechanism that generates synthetic variants of previous data during training, thereby enhancing CL performance. Extensive evaluations on widely used benchmark datasets and settings demonstrate that the proposed algorithm achieves superior performance compared to existing baselines, particularly in scenarios involving large-scale datasets and a high number of tasks.
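As a rough illustration of the perturbation-based augmentation mechanism mentioned in the abstract, the sketch below (hypothetical names, PyTorch assumed) generates several noisy variants of each stored exemplar at replay time, so rehearsal sees diverse synthetic views of previous-task data rather than the same few samples.

```python
import torch

def perturb_exemplars(exemplars, num_variants=4, noise_scale=0.05):
    """Generate `num_variants` noisy copies of each stored exemplar.

    exemplars: tensor [M, C, H, W] of condensed prototypical exemplars
    returns:   tensor [M * num_variants, C, H, W]
    """
    variants = exemplars.repeat_interleave(num_variants, dim=0)
    noise = noise_scale * torch.randn_like(variants)
    return (variants + noise).clamp(0.0, 1.0)  # keep pixel values in a valid range

# During online training, a replay batch would mix current-task samples with
# perturbed variants of the stored exemplars, e.g.:
# replay_batch = torch.cat([current_batch, perturb_exemplars(memory_exemplars)])
```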
Problem

Research questions and friction points this paper is trying to address.

continual learning
catastrophic forgetting
memory efficiency
exemplar condensation
online learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

prototypical exemplars
memory-efficient
online continual learning
synthetic data augmentation
catastrophic forgetting
Minh-Duong Nguyen
College of Engineering and Computer Science, VinUniversity
Federated Learning, Continual Learning, MTL, Domain Generalization, Information Theory
Thien-Thanh Dao
Faculty of Information Systems, Phenikaa University, Hanoi, Vietnam
Le-Tuan Nguyen
College of Engineering and Computer Science, VinUniversity, Hanoi, Vietnam
Dung D. Le
College of Engineering and Computer Science, VinUniversity, Hanoi, Vietnam
Kok-Seng Wong
College of Engineering and Computer Science, VinUniversity, Hanoi, Vietnam