Online Continual Learning: A Systematic Literature Review of Approaches, Challenges, and Benchmarks

📅 2025-01-09
🤖 AI Summary
This survey addresses core challenges in online continual learning (OCL): catastrophic forgetting, the stability-plasticity trade-off, high computational overhead, and weak cross-domain generalization. It presents the most comprehensive systematic literature review (SLR) of OCL to date, analyzing 81 methods across 83 multimodal datasets and over 1,000 task characteristics. The review proposes a full-dimensional SLR framework covering algorithms, modular components, and resource constraints; introduces standardized feature extraction and modular decomposition to expose scalability bottlenecks under resource limitations; and identifies an emerging optimization pathway that integrates self-supervised pretraining, sparse retrieval, and generative replay. The result is a structured OCL knowledge graph that pinpoints three critical bottlenecks (computational efficiency, dynamic task-boundary handling, and domain generalization), together with an open-source reproducibility pipeline on GitHub and empirically grounded design guidelines, establishing a foundational benchmark for both algorithmic innovation and real-world deployment.

📝 Abstract
Online Continual Learning (OCL) is a critical area in machine learning, focusing on enabling models to adapt to evolving data streams in real-time while addressing challenges such as catastrophic forgetting and the stability-plasticity trade-off. This study conducts the first comprehensive Systematic Literature Review (SLR) on OCL, analyzing 81 approaches, extracting over 1,000 features (specific tasks addressed by these approaches), and identifying more than 500 components (sub-models within approaches, including algorithms and tools). We also review 83 datasets spanning applications like image classification, object detection, and multimodal vision-language tasks. Our findings highlight key challenges, including reducing computational overhead, developing domain-agnostic solutions, and improving scalability in resource-constrained environments. Furthermore, we identify promising directions for future research, such as leveraging self-supervised learning for multimodal and sequential data, designing adaptive memory mechanisms that integrate sparse retrieval and generative replay, and creating efficient frameworks for real-world applications with noisy or evolving task boundaries. By providing a rigorous and structured synthesis of the current state of OCL, this review offers a valuable resource for advancing this field and addressing its critical challenges and opportunities. The complete SLR methodology steps and extracted data are publicly available through the provided link: https://github.com/kiyan-rezaee/Systematic-Literature-Review-on-Online-Continual-Learning
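The memory mechanisms the abstract refers to are typically rehearsal buffers that store a small subset of the stream and replay it alongside incoming data to mitigate catastrophic forgetting. As a minimal, illustrative sketch (not the paper's own implementation), the common reservoir-sampling variant can be written as:

```python
import random


class ReplayBuffer:
    """Fixed-size rehearsal buffer using reservoir sampling, a common
    memory mechanism in online continual learning. Every item seen in
    the stream has an equal probability of being retained."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.buffer = []   # stored (example, label) pairs
        self.seen = 0      # total number of stream items observed

    def add(self, example, label):
        """Insert while under capacity; afterwards, replace a stored
        item with probability capacity / seen (reservoir sampling)."""
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append((example, label))
        else:
            j = random.randrange(self.seen)
            if j < self.capacity:
                self.buffer[j] = (example, label)

    def sample(self, batch_size):
        """Draw a rehearsal mini-batch to mix with the incoming batch
        during the next gradient step."""
        k = min(batch_size, len(self.buffer))
        return random.sample(self.buffer, k)
```

In a training loop, each incoming mini-batch would be concatenated with a batch drawn via `sample()` before the update, so old and new data are optimized jointly.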
Problem

Research questions and friction points this paper is trying to address.

Online Continual Learning
Knowledge Retention
Resource-Efficient Learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Online Continual Learning
Comprehensive Review
Memory Mechanisms