Bayesian Learning-driven Prototypical Contrastive Loss for Class-Incremental Learning

📅 2024-05-17
🏛️ arXiv.org
📈 Citations: 4
Influential: 0
🤖 AI Summary
To address catastrophic forgetting in class-incremental learning, this paper proposes a Bayesian-driven prototype contrastive learning framework. The method introduces Bayesian uncertainty modeling into the prototype contrastive loss for the first time, enabling dynamic, adaptive weighting between cross-entropy and contrastive losses. It establishes a prototype-level contrastive learning paradigm tailored to incremental settings, jointly enforcing intra-class compactness and inter-class separability in the latent space to co-optimize representations of both old and new class prototypes. Evaluated on CIFAR-10, CIFAR-100, and a GNSS interference classification dataset, the approach consistently outperforms state-of-the-art methods, achieving simultaneous improvements in both final accuracy and forgetting rate. These results demonstrate its effectiveness in enhancing representation robustness and sustaining discriminative capability across incremental tasks.

📝 Abstract
The primary objective of methods in continual learning is to learn tasks sequentially over time from a stream of data, while mitigating the detrimental phenomenon of catastrophic forgetting. In this paper, we focus on learning an optimal representation between previous class prototypes and newly encountered ones. We propose a prototypical network with a Bayesian learning-driven contrastive loss (BLCL) tailored specifically for class-incremental learning scenarios. To this end, we introduce a contrastive loss that incorporates new classes into the latent representation by reducing the intra-class distance and increasing the inter-class distance. Our approach dynamically adapts the balance between the cross-entropy and contrastive loss functions with a Bayesian learning technique. Empirical evaluations conducted on both the CIFAR-10 and CIFAR-100 datasets for image classification, and on images of a GNSS-based dataset for interference classification, validate the efficacy of our method, showcasing its superiority over existing state-of-the-art approaches.
Problem

Research questions and friction points this paper is trying to address.

Mitigate catastrophic forgetting in sequential task learning
Learn effective representation between old and new class prototypes
Dynamically balance cross-entropy and contrastive loss functions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bayesian learning-driven contrastive loss for representation
Dynamic balance between cross-entropy and contrastive loss
Prototypical network for class-incremental learning scenarios
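The summary and bullets describe a prototype-level contrastive objective that pulls an embedding toward its own class prototype and pushes it away from the prototypes of other (old and new) classes. An InfoNCE-style sketch of that idea is below; the function name, temperature value, and exact loss form are assumptions, not the paper's definition:

```python
import numpy as np

def prototype_contrastive_loss(z, prototypes, label, temperature=0.1):
    """Contrast one embedding against all class prototypes: the
    true-class prototype is the positive, all others are negatives.
    (Illustrative sketch; the paper's loss may differ in detail.)"""
    z = z / np.linalg.norm(z)                                  # L2-normalize embedding
    protos = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    logits = protos @ z / temperature                          # cosine similarities
    logits -= logits.max()                                     # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum())          # log-softmax
    return -log_probs[label]                                   # cross-entropy vs. true prototype
```

Minimizing this term simultaneously enforces intra-class compactness (high similarity to the own prototype) and inter-class separability (low similarity to all other prototypes), which is what lets old and new class representations be co-optimized in the shared latent space.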
👥 Authors
N. Raichur — Fraunhofer Institute for Integrated Circuits (IIS), Nürnberg, Germany
Lucas Heublein — Fraunhofer Institute for Integrated Circuits (IIS), Nürnberg, Germany
Tobias Feigl — Fraunhofer IIS (AI, GNSS Interference, HCI, Indoor Navigation, Indoor Positioning)
A. Rügamer — Fraunhofer Institute for Integrated Circuits (IIS), Nürnberg, Germany
Christopher Mutschler — Division Director Positioning and Networks, Fraunhofer IIS (Reinforcement Learning, Machine Learning, Indoor Positioning, Indoor Navigation)
Felix Ott — Program Manager Signals Intelligence, Fraunhofer IIS (Representation Learning, Federated Learning, Signals Analysis, Time Series, GNSS)