Brain-inspired analogical mixture prototypes for few-shot class-incremental learning

📅 2025-02-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
Few-shot class-incremental learning (FSCIL) suffers from the dual challenges of catastrophic forgetting of old classes and overfitting to novel classes. Method: This paper proposes a brain-inspired, analogy-driven prototypical learning framework that integrates statistical analogy into hybrid prototype modeling. It introduces a statistical analogy-based calibration strategy for novel-class prototype distributions and a prototype covariance adaptation mechanism to mitigate knowledge forgetting. Furthermore, it employs Mahalanobis distance metric learning and a soft-voting ensemble to enhance discriminative robustness, with end-to-end optimization built upon fine-tuned Vision Transformers. Contribution/Results: The method achieves state-of-the-art performance across multiple FSCIL benchmarks, significantly improving the balance between accuracy on novel and base classes. Experimental results validate the effectiveness and generalization advantage of analogy-driven prototypical modeling for few-shot continual learning.
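The statistical analogy-based calibration described above can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the cosine-similarity ranking, the number of neighbor classes `top_k`, and the blending weight `alpha` are all illustrative assumptions.

```python
import numpy as np

def calibrate_novel_stats(novel_feats, base_means, base_covs, top_k=2, alpha=0.5):
    """Estimate the mean and covariance of a novel class from its few support
    features, borrowing statistics from the most similar base classes
    (a sketch of statistical-analogy calibration; top_k and alpha are
    hypothetical hyperparameters, not taken from the paper)."""
    naive_mean = novel_feats.mean(axis=0)
    # Rank base classes by cosine similarity to the naive novel prototype.
    sims = base_means @ naive_mean / (
        np.linalg.norm(base_means, axis=1) * np.linalg.norm(naive_mean) + 1e-12)
    nearest = np.argsort(sims)[-top_k:]
    w = sims[nearest] / sims[nearest].sum()
    # Blend the few-shot estimate with similarity-weighted base statistics.
    mean = alpha * naive_mean + (1 - alpha) * (w @ base_means[nearest])
    cov = alpha * np.cov(novel_feats, rowvar=False) + \
          (1 - alpha) * np.tensordot(w, base_covs[nearest], axes=1)
    return mean, cov
```

The key design idea is that a handful of shots gives a noisy covariance estimate, so transferring second-order statistics from semantically similar, data-rich base classes regularizes the novel-class distribution.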

📝 Abstract
Few-shot class-incremental learning (FSCIL) poses significant challenges for artificial neural networks due to the need to efficiently learn from limited data while retaining knowledge of previously learned tasks. Inspired by the brain's mechanisms for categorization and analogical learning, we propose a novel approach called Brain-inspired Analogical Mixture Prototypes (BAMP). BAMP has three components: mixed prototypical feature learning, statistical analogy, and soft voting. Starting from a pre-trained Vision Transformer (ViT), mixed prototypical feature learning represents each class using a mixture of prototypes and fine-tunes these representations during the base session. The statistical analogy calibrates the mean and covariance matrix of prototypes for new classes according to their similarity to the base classes, and computes classification scores with the Mahalanobis distance. Soft voting combines the merits of statistical analogy and an off-the-shelf FSCIL method. Our experiments on benchmark datasets demonstrate that BAMP outperforms the state of the art on both the traditional big-start FSCIL setting and the challenging small-start FSCIL setting. The study suggests that brain-inspired analogical mixture prototypes can alleviate the catastrophic forgetting and over-fitting problems in FSCIL.
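The Mahalanobis scoring and soft-voting steps from the abstract might look like the following sketch, under stated assumptions: the covariance shrinkage term `eps` and the equal-weight averaging of softmax probabilities are illustrative choices, not the paper's specification.

```python
import numpy as np

def mahalanobis_scores(x, means, covs, eps=1e-3):
    """Negative Mahalanobis distance of feature x to each class prototype
    (higher score = closer class). eps adds shrinkage for numerical stability."""
    scores = []
    for mu, cov in zip(means, covs):
        prec = np.linalg.inv(cov + eps * np.eye(cov.shape[0]))
        d = x - mu
        scores.append(-float(d @ prec @ d))
    return np.array(scores)

def soft_vote(scores_a, scores_b, weight=0.5):
    """Average the softmax probabilities of two classifiers, e.g. the
    statistical-analogy scores and an off-the-shelf FSCIL method's logits."""
    def softmax(s):
        e = np.exp(s - s.max())
        return e / e.sum()
    return weight * softmax(scores_a) + (1 - weight) * softmax(scores_b)
```

Soft voting at the probability level (rather than picking one classifier's hard decision) lets the ensemble fall back on whichever component is more confident for a given class.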
Problem

Research questions and friction points this paper is trying to address.

Few-shot class-incremental learning challenges
Brain-inspired analogical mixture prototypes
Alleviates catastrophic forgetting and over-fitting
Innovation

Methods, ideas, or system contributions that make the work stand out.

Brain-inspired Analogical Mixture Prototypes
Mixed prototypical feature learning
Statistical analogy and soft voting
Wanyi Li
State Key Laboratory of Multimodal Artificial Intelligence Systems, Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China
Wei Wei
State Key Laboratory of Multimodal Artificial Intelligence Systems, Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China, and also with the School of Artificial Intelligence, University of Chinese Academy of Sciences, Beijing 100049, China
Yongkang Luo
Institute of Automation, Chinese Academy of Sciences
Robot Learning, Dexterous Manipulation, Intelligence System, Computer Vision
Peng Wang
State Key Laboratory of Multimodal Artificial Intelligence Systems, Institute of Automation, Chinese Academy of Sciences, Beijing 100190, China, with the CAS Center for Excellence in Brain Science and Intelligence Technology, Chinese Academy of Sciences, Shanghai 200031, China, and also with the Centre for Artificial Intelligence and Robotics, Hong Kong Institute of Science and Innovation, Chinese Academy of Sciences, Hong Kong 999077, Hong Kong