Learn by Reasoning: Analogical Weight Generation for Few-Shot Class-Incremental Learning

📅 2025-03-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the challenges in few-shot class-incremental learning (FSCIL)—namely, extremely limited samples for novel classes, the need for parameter fine-tuning, and catastrophic forgetting of old knowledge—this paper proposes BiAG, a brain-inspired analogical weight generation framework that requires no parameter updates. Methodologically, BiAG introduces: (1) an analogical generator that formalizes human analogical reasoning as cross-class semantic mapping; (2) Weight Self-Attention (WSA) and Weight & Prototype Analogical Attention (WPAA), enabling zero-shot, fine-tuning-free generation of novel-class classifiers; and (3) a Semantic Conversion Module (SCM) guided by neural collapse theory to enhance feature discriminability and generalization. Evaluated on miniImageNet, CUB-200, and CIFAR-100, BiAG significantly outperforms state-of-the-art methods, achieving substantial improvements in both final accuracy and average incremental accuracy.
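The summary describes WPAA as generating a novel class's classifier weights by attending from its prototype to existing classes. As a rough illustration of that idea (a generic cross-attention sketch under assumed shapes, not the paper's exact WPAA formulation), the novel prototype can score old-class prototypes, and the resulting attention can mix old-class weights, with a residual back to the prototype:

```python
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def analogical_weight(prototype, old_weights, old_prototypes, temperature=1.0):
    """Illustrative analogy-by-attention: the novel-class prototype queries
    old-class prototypes; the attention weights mix old-class classifier
    weights into a new weight vector (hypothetical, simplified sketch)."""
    scores = [dot(prototype, q) / temperature for q in old_prototypes]
    attn = softmax(scores)
    dim = len(old_weights[0])
    mixed = [sum(a * w[i] for a, w in zip(attn, old_weights)) for i in range(dim)]
    # residual connection back to the prototype itself
    return [m + p for m, p in zip(mixed, prototype)]

# toy example: two old classes in a 3-D feature space
old_protos = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]
old_ws     = [[0.9, 0.1, 0.0], [0.1, 0.9, 0.0]]
novel_p    = [0.8, 0.2, 0.0]
w_new = analogical_weight(novel_p, old_ws, old_protos)
```

Because the novel prototype is closer to the first old class, the generated weight leans toward that class's weights; no gradient update touches the backbone or the old classifiers, which matches the fine-tuning-free property claimed above.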

📝 Abstract
Few-Shot Class-Incremental Learning (FSCIL) enables models to learn new classes from limited data while retaining performance on previously learned classes. Traditional FSCIL methods often require fine-tuning parameters with limited new class data and suffer from a separation between learning new classes and utilizing old knowledge. Inspired by the analogical learning mechanisms of the human brain, we propose a novel analogical generative method. Our approach includes the Brain-Inspired Analogical Generator (BiAG), which derives new class weights from existing classes without parameter fine-tuning during incremental stages. BiAG consists of three components: Weight Self-Attention Module (WSA), Weight & Prototype Analogical Attention Module (WPAA), and Semantic Conversion Module (SCM). SCM uses Neural Collapse theory for semantic conversion, WSA supplements new class weights, and WPAA computes analogies to generate new class weights. Experiments on miniImageNet, CUB-200, and CIFAR-100 datasets demonstrate that our method achieves higher final and average accuracy compared to SOTA methods.
Problem

Research questions and friction points this paper is trying to address.

FSCIL learns new classes with limited data while preserving old class performance
Traditional FSCIL requires parameter fine-tuning on scarce novel-class data and separates new-class learning from old knowledge
Proposes brain-inspired analogical generation for weight creation without fine-tuning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Brain-Inspired Analogical Generator (BiAG) for weight generation
No parameter fine-tuning during incremental learning
Neural Collapse theory for semantic conversion
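Neural Collapse theory predicts that well-trained classifier weights and class means converge to a simplex equiangular tight frame (ETF), in which any two class directions meet at the same maximal angle. The summary says the SCM uses this theory to guide semantic conversion; the geometric target itself can be sketched as follows (the construction is standard ETF geometry, not code from the paper):

```python
import math

def simplex_etf(num_classes):
    """Unit-norm vertices of a simplex ETF for K classes: start from the
    standard basis in R^K, center it, and normalize. Any two distinct
    vertices have cosine similarity -1/(K-1), the Neural Collapse geometry."""
    K = num_classes
    verts = []
    for k in range(K):
        v = [-1.0 / K] * K   # subtract the global mean 1/K from every entry
        v[k] += 1.0          # the k-th standard basis vector
        norm = math.sqrt(sum(x * x for x in v))
        verts.append([x / norm for x in v])
    return verts

etf = simplex_etf(4)
cos01 = sum(a * b for a, b in zip(etf[0], etf[1]))  # expect -1/(4-1) = -1/3
```

Using this maximally separated configuration as a target is what gives the generated classifiers their discriminability: new class directions are placed as far from existing ones as the geometry allows.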