SAILS: Segment Anything with Incrementally Learned Semantics for Task-Invariant and Training-Free Continual Learning

πŸ“… 2026-02-16
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
This work proposes the first entirely training-free continual learning framework for class-incremental semantic segmentation, addressing the high computational cost and catastrophic forgetting inherent in existing approaches. The method decouples the task into two stages: zero-shot region extraction leveraging the Segment Anything Model, followed by prototype-based semantic association. Within a fixed feature space, it models semantic concepts through multi-prototype selective intra-class clustering. By eliminating the need for model updates, the framework completely avoids catastrophic forgetting and exhibits both forward and backward knowledge transfer. Remarkably, it outperforms most training-based methods on standard CISS benchmarks, demonstrating particularly stable performance over long task sequences.

πŸ“ Abstract
Continual learning remains constrained by the need for repeated retraining, high computational costs, and the persistent challenge of forgetting. These factors limit the applicability of continual learning in real-world settings, as iterative model updates demand substantial computational resources and inherently exacerbate forgetting. We present SAILS -- Segment Anything with Incrementally Learned Semantics, a training-free framework for Class-Incremental Semantic Segmentation (CISS) that sidesteps these challenges entirely. SAILS leverages foundation models to decouple CISS into two stages: zero-shot region extraction using the Segment Anything Model (SAM), followed by semantic association through prototypes in a fixed feature space. SAILS incorporates selective intra-class clustering, yielding multiple prototypes per class to better model intra-class variability. Our results demonstrate that, despite requiring no incremental training, SAILS typically surpasses the performance of existing training-based approaches on standard CISS datasets, particularly in long and challenging task sequences where forgetting tends to be most severe. By avoiding parameter updates, SAILS completely eliminates forgetting and maintains consistent, task-invariant performance. Furthermore, SAILS exhibits positive backward transfer, where the introduction of new classes can enhance performance on previous classes.
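The second stage of the pipeline described in the abstract can be illustrated with a minimal sketch: each class keeps several prototypes obtained by clustering its region embeddings in a fixed feature space, and a SAM-extracted region is labeled by its nearest prototype. This is a hypothetical illustration, not the paper's implementation; the class name `PrototypeBank`, the tiny k-means routine, and all parameter choices are our own assumptions.

```python
import numpy as np

class PrototypeBank:
    """Illustrative sketch (not the paper's code) of prototype-based
    semantic association: each class is represented by multiple
    prototypes from clustering its embeddings in a fixed feature space."""

    def __init__(self, n_prototypes=3, seed=0):
        self.n_prototypes = n_prototypes
        self.rng = np.random.default_rng(seed)
        self.prototypes = []  # list of (class_id, unit-norm prototype)

    @staticmethod
    def _normalize(x):
        return x / (np.linalg.norm(x, axis=-1, keepdims=True) + 1e-12)

    def add_class(self, class_id, features):
        """Register a new class without touching existing prototypes
        (no parameter updates, hence no catastrophic forgetting)."""
        feats = self._normalize(np.asarray(features, dtype=np.float64))
        k = min(self.n_prototypes, len(feats))
        # Tiny cosine k-means, standing in for the paper's selective
        # intra-class clustering (assumption, for illustration only).
        centers = feats[self.rng.choice(len(feats), k, replace=False)]
        for _ in range(20):
            assign = np.argmax(feats @ centers.T, axis=1)
            for j in range(k):
                members = feats[assign == j]
                if len(members):
                    centers[j] = self._normalize(members.mean(axis=0))
        for c in centers:
            self.prototypes.append((class_id, c))

    def classify(self, feature):
        """Label one region embedding by its nearest prototype."""
        f = self._normalize(np.asarray(feature, dtype=np.float64))
        sims = [float(f @ p) for (_, p) in self.prototypes]
        class_id, _ = self.prototypes[int(np.argmax(sims))]
        return class_id
```

Because adding a class only appends prototypes, earlier classes are never overwritten, which is the mechanism behind the task-invariant behavior the abstract claims; backward transfer would arise at the region level, where better-segmented new classes refine the boundaries of old ones.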
Problem

Research questions and friction points this paper is trying to address.

continual learning
class-incremental semantic segmentation
catastrophic forgetting
training-free
task-invariant
Innovation

Methods, ideas, or system contributions that make the work stand out.

training-free continual learning
class-incremental semantic segmentation
Segment Anything Model (SAM)
prototype-based semantic association
catastrophic forgetting elimination