Semi-parametric Memory Consolidation: Towards Brain-like Deep Continual Learning

📅 2025-04-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
Deep neural networks suffer from catastrophic forgetting when sequentially learning new tasks, hindering their continual adaptation in open-world environments. To address this, we propose a biologically inspired continual learning framework that integrates semi-parametric memory with a brain-inspired “wake–sleep” synaptic consolidation mechanism. During the “wake” phase, task-specific knowledge is dynamically written into a semi-parametric memory module; during the “sleep” phase, offline replay and selective synaptic freezing consolidate previously learned representations without requiring raw data storage. The framework is fully compatible with class-incremental learning. Evaluated on realistic benchmarks including ImageNet, our method significantly mitigates forgetting—reducing average forgetting by 32%—while simultaneously improving both forward transfer (new-task accuracy) and backward transfer (old-task stability). To our knowledge, this is the first approach to jointly optimize parameter efficiency, neurobiological plausibility, and task performance in continual learning.
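The summary above describes the two phases but gives no code, so here is a minimal sketch of one plausible realization in PyTorch. The `SemiParametricMemory` class, the prototype-based write/replay scheme, the `wake_phase`/`sleep_phase` functions, and the assumed `model.backbone`/`model.head` split are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of the wake-sleep loop: the wake phase writes task
# features into a semi-parametric memory; the sleep phase replays from that
# memory offline. Names and design are assumptions, not the paper's API.
import torch
import torch.nn.functional as F


class SemiParametricMemory:
    """Non-parametric store of class prototypes written during the wake phase."""

    def __init__(self):
        self.prototypes = {}  # class id -> running-mean feature vector
        self.counts = {}

    def write(self, features, labels):
        # Keep a running mean per class instead of storing raw data.
        for f, y in zip(features, labels.tolist()):
            n = self.counts.get(y, 0)
            p = self.prototypes.get(y, torch.zeros_like(f))
            self.prototypes[y] = (p * n + f.detach()) / (n + 1)
            self.counts[y] = n + 1

    def replay_batch(self):
        # Return all stored prototypes and their labels for offline replay.
        ys = list(self.prototypes)
        return torch.stack([self.prototypes[y] for y in ys]), torch.tensor(ys)


def wake_phase(model, loader, memory, optimizer):
    """Learn the new task and write its features into semi-parametric memory."""
    model.train()
    for x, y in loader:
        feats = model.backbone(x)
        loss = F.cross_entropy(model.head(feats), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        memory.write(feats, y)


def sleep_phase(model, memory, optimizer, steps=100):
    """Consolidate offline by replaying from memory; no raw old-task data needed."""
    model.train()
    for _ in range(steps):
        protos, ys = memory.replay_batch()
        loss = F.cross_entropy(model.head(protos), ys)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```

Storing class-mean prototypes rather than raw exemplars is one way to satisfy the "without requiring raw data storage" constraint mentioned in the summary; the paper's memory module may store richer representations.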

📝 Abstract
Humans and most animals inherently possess a distinctive capacity to continually acquire novel experiences and accumulate worldly knowledge over time. This ability, termed continual learning, is also critical for deep neural networks (DNNs) to adapt to a dynamically evolving world in open environments. However, DNNs notoriously suffer from catastrophic forgetting of previously learned knowledge when trained on sequential tasks. In this work, inspired by the interplay between memory and learning systems in the human brain, we propose a novel biomimetic continual learning framework that integrates semi-parametric memory with a wake–sleep consolidation mechanism. For the first time, our method enables deep neural networks to retain high performance on novel tasks while preserving prior knowledge in challenging real-world continual learning scenarios, e.g., class-incremental learning on ImageNet. This study demonstrates that emulating biological intelligence offers a promising path toward endowing deep neural networks with continual learning capabilities.
Problem

Research questions and friction points this paper is trying to address.

Address catastrophic forgetting in deep neural networks
Enable continual learning in dynamic environments
Integrate biomimetic memory for knowledge retention
Innovation

Methods, ideas, or system contributions that make the work stand out.

Semi-parametric memory for continual learning
Wake–sleep consolidation mechanism with selective synaptic freezing (see the sketch after this list)
Biomimetic framework for deep neural networks
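To make "selective synaptic freezing" concrete, below is a hedged sketch of one common realization: estimate a Fisher-style importance score per weight, then mask the gradients of the most important weights during sleep-phase consolidation. The squared-gradient importance estimate, the quantile-based freeze ratio, and the helper names are illustrative assumptions; the paper may use a different freezing criterion.

```python
# One plausible way to realize selective synaptic freezing: zero the
# gradients of the most important weights so consolidation leaves them
# untouched. All names and the importance measure are assumptions.
import torch


def estimate_importance(model, loss_fn, loader):
    """Accumulate squared gradients as a per-parameter importance score."""
    importance = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    model.eval()
    for x, y in loader:
        model.zero_grad()
        loss_fn(model(x), y).backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                importance[n] += p.grad.detach() ** 2
    return importance


def freeze_mask(importance, freeze_ratio=0.5):
    """Mask is 0 for the top `freeze_ratio` fraction of weights (frozen)."""
    scores = torch.cat([v.flatten() for v in importance.values()])
    threshold = torch.quantile(scores, 1.0 - freeze_ratio)
    return {n: (v < threshold).float() for n, v in importance.items()}


def masked_step(model, optimizer, mask):
    """Apply the mask before the optimizer step; with plain SGD this keeps
    frozen synapses exactly fixed."""
    for n, p in model.named_parameters():
        if p.grad is not None:
            p.grad *= mask[n]
    optimizer.step()
```

In a wake–sleep loop, the mask would be recomputed after each task's wake phase and `masked_step` applied during sleep-phase replay, so consolidation updates only the synapses deemed safe to change.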
Geng Liu
The Greater Bay Area National Center of Technology Innovation
High-Performance Computing · CFD · LBM
Fei Zhu
Centre for Artificial Intelligence and Robotics, Hong Kong Institute of Science and Innovation, Chinese Academy of Sciences, Hong Kong SAR, China.
Rong Feng
Centre for Artificial Intelligence and Robotics, Hong Kong Institute of Science and Innovation, Chinese Academy of Sciences, Hong Kong SAR, China.
Zhiqiang Yi
The Department of Neurosurgery, Peking University First Hospital, Beijing, China.
Shiqi Wang
Department of Computer Science, City University of Hong Kong, Hong Kong SAR, China.
Gaofeng Meng
State Key Laboratory of Multimodal Artificial Intelligence Systems, Institute of Automation, Chinese Academy of Sciences, Beijing, China; Centre for Artificial Intelligence and Robotics, Hong Kong Institute of Science and Innovation, Chinese Academy of Sciences, Hong Kong SAR, China; School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing, China.
Zhaoxiang Zhang
Institute of Automation, Chinese Academy of Sciences
Computer Vision · Pattern Recognition · Biologically-inspired Learning