Sequencing to Mitigate Catastrophic Forgetting in Continual Learning

📅 2025-12-18
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
In continual learning, models suffer from catastrophic forgetting when acquiring new tasks. This paper introduces task sequence ordering as an independent intervention dimension—first proposed in the literature—and presents a zero-shot scoring method to identify optimal task orderings. By modeling task similarity and adapting neural architecture search (NAS) heuristics, our approach estimates cross-task transfer potential without fine-tuning. It seamlessly integrates with mainstream continual learning strategies—including Elastic Weight Consolidation (EWC) and experience replay—yielding substantial mitigation of forgetting: average accuracy improves by 3.2–7.8% across multiple benchmarks, while retention of previously learned tasks increases by 12.4–29.6%. Moreover, it enhances the robustness and generalization of diverse baseline methods. The core innovation lies in decoupling task-order optimization from model parameter updates, establishing a novel paradigm for continual learning that separates sequencing logic from learning dynamics.

Technology Category

Application Category

📝 Abstract
To cope with real-world dynamics, an intelligent system needs to incrementally acquire, update, and exploit knowledge throughout its lifetime. This ability, known as Continual learning, provides a foundation for AI systems to develop themselves adaptively. Catastrophic forgetting is a major challenge to the progress of Continual Learning approaches, where learning a new task usually results in a dramatic performance drop on previously learned ones. Many approaches have emerged to counteract the impact of CF. Most of the proposed approaches can be categorized into five classes: replay-based, regularization-based, optimization-based, representation-based, and architecture-based. In this work, we approach the problem from a different angle, specifically by considering the optimal sequencing of tasks as they are presented to the model. We investigate the role of task sequencing in mitigating CF and propose a method for determining the optimal task order. The proposed method leverages zero-shot scoring algorithms inspired by neural architecture search (NAS). Results demonstrate that intelligent task sequencing can substantially reduce CF. Moreover, when combined with traditional continual learning strategies, sequencing offers enhanced performance and robustness against forgetting. Additionally, the presented approaches can find applications in other fields, such as curriculum learning.
Problem

Research questions and friction points this paper is trying to address.

Mitigates catastrophic forgetting in continual learning
Determines optimal task sequencing to reduce forgetting
Enhances performance with traditional continual learning strategies
Innovation

Methods, ideas, or system contributions that make the work stand out.

Leverages task sequencing to mitigate forgetting
Uses zero-shot scoring algorithms from NAS
Combines sequencing with traditional continual learning strategies
H
Hesham G. Moussa
wireless department at Huawei Technologies Canada, Kanata, Canada
Aroosa Hameed
Aroosa Hameed
Huawei Technologies
Wireless CommunicationArtificial IntelligenceDistributed LearningIoTEdge Computing
A
Arashmid Akhavain
wireless department at Huawei Technologies Canada, Kanata, Canada