🤖 AI Summary
Continual learning from non-stationary visual data streams in computer graphics remains challenging: models must adapt dynamically to emerging visual patterns while mitigating catastrophic forgetting.
Method: This paper proposes an online visual prototype learning framework featuring two novel mechanisms: Ancestral Prototype Adaptation (APA), which preserves knowledge continuity through inheritance and evolutionary refinement of ancestral prototypes; and Mamba Feedback (MF), which leverages the Mamba architecture to model temporal dependencies and selectively refine prototypes of challenging classes. The method integrates online prototype clustering, selective discriminative subspace projection, and feedback-driven prototype refinement.
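The prototype mechanics suggested by this summary can be sketched roughly as follows. This is a minimal illustration, not the paper's method: the class names, the EMA-style "ancestral" inheritance, and the error-weighted update standing in for Mamba Feedback are all assumptions, since the summary gives no concrete update rules.

```python
import numpy as np

class OnlinePrototypes:
    """Illustrative sketch of online prototype learning. `momentum` plays the
    role of inheriting from an 'ancestral' prototype; `hard_boost` mimics
    feedback-driven extra refinement of challenging classes. All names and
    update rules are assumptions, not the paper's actual formulation."""

    def __init__(self, dim, momentum=0.9, hard_boost=0.5):
        self.momentum = momentum
        self.hard_boost = hard_boost
        self.protos = {}   # class id -> prototype vector
        self.errors = {}   # class id -> recent error rate (EMA)

    def predict(self, z):
        # Nearest-prototype classification in the embedding space.
        return min(self.protos, key=lambda c: np.linalg.norm(z - self.protos[c]))

    def update(self, z, label):
        if label not in self.protos:
            self.protos[label] = z.copy()
            self.errors[label] = 0.0
            return
        # Track how often this class is currently misclassified.
        wrong = float(self.predict(z) != label)
        self.errors[label] = 0.9 * self.errors[label] + 0.1 * wrong
        # Harder classes (higher recent error) receive a larger update,
        # a crude stand-in for the targeted feedback mechanism.
        lr = (1.0 - self.momentum) * (1.0 + self.hard_boost * self.errors[label])
        self.protos[label] = (1.0 - lr) * self.protos[label] + lr * z
```

Each prototype thus inherits most of its previous (ancestral) state at every step, which is one simple way to trade off stability against adaptation; the actual framework additionally projects features into a selective discriminative subspace and models temporal dependencies with Mamba, neither of which is reflected here.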
Contribution/Results: Evaluated on CIFAR-10 and CIFAR-100, the framework achieves significant improvements over state-of-the-art continual learning methods: higher accuracy, over 40% less forgetting, and a better balance between dynamic adaptability and model stability.
📝 Abstract
In the realm of computer graphics, the ability to learn continuously from non-stationary data streams while adapting to new visual patterns and mitigating catastrophic forgetting is of paramount importance. Existing approaches often struggle to capture and represent the essential characteristics of evolving visual concepts, hindering their applicability to dynamic graphics tasks. In this paper, we propose Ancestral Mamba, a novel approach that integrates online prototype learning into a selective discriminant space model for efficient and robust online continual learning. The key components of our approach include Ancestral Prototype Adaptation (APA), which continuously refines and builds upon learned visual prototypes, and Mamba Feedback (MF), which provides targeted feedback to adapt to challenging visual patterns. APA enables the model to continuously adapt its prototypes, building upon ancestral knowledge to tackle new challenges, while MF acts as a targeted feedback mechanism, focusing on challenging classes and refining their representations. Extensive experiments on graphics-oriented datasets, such as CIFAR-10 and CIFAR-100, demonstrate the superior performance of Ancestral Mamba compared to state-of-the-art baselines, achieving significant improvements in accuracy and forgetting mitigation.