Unlearning-based sliding window for continual learning under concept drift

📅 2026-03-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the challenge of concept drift in task-agnostic continual learning, where models must efficiently adapt to new data while discarding outdated information. Traditional sliding window approaches incur substantial computational overhead due to frequent full retraining. To overcome this limitation, the paper introduces machine unlearning into the concept drift setting for the first time, proposing an unlearning-based sliding window mechanism that selectively removes the influence of obsolete samples during window updates and incrementally refines the model without repeated full retraining. The method enables online learning on image data streams without explicit task boundaries and achieves competitive accuracy across multiple concept drift classification benchmarks while significantly reducing computational cost compared to standard sliding window retraining strategies.

📝 Abstract
Traditional machine learning assumes a stationary data distribution, yet many real-world applications operate on nonstationary streams in which the underlying concept evolves over time. This problem can also be viewed as task-free continual learning under concept drift, where a model must adapt sequentially without explicit task identities or task boundaries. In such settings, effective learning requires both rapid adaptation to new data and forgetting of outdated information. A common solution is based on a sliding window, but this approach is often computationally demanding because the model must be repeatedly retrained from scratch on the most recent data. We propose a different perspective based on machine unlearning. Instead of rebuilding the model each time the active window changes, we remove the influence of outdated samples using unlearning and then update the model with newly observed data. This enables efficient, targeted forgetting while preserving adaptation to evolving distributions. To the best of our knowledge, this is the first work to connect machine unlearning with concept drift mitigation for task-free continual learning. Empirical results on image stream classification across multiple drift scenarios demonstrate that the proposed approach offers a competitive and computationally efficient alternative to standard sliding-window retraining. Our implementation can be found at https://anonymous.4open.science/r/MUNDataStream-60F3.
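The mechanism the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's actual method: the `UnlearningSlidingWindow` class, the logistic-regression learner, and the use of single-step gradient ascent as the approximate unlearning operation are all assumptions made here for concreteness. The key idea it demonstrates is that when a sample leaves the active window, its influence is removed by an unlearning step rather than by retraining the model from scratch on the remaining window.

```python
import numpy as np

class UnlearningSlidingWindow:
    """Illustrative sketch (not the paper's algorithm): an online logistic
    classifier over a sliding window. When the window slides, the expired
    sample's influence is approximately unlearned via one gradient-ascent
    step on its loss, instead of retraining on the whole window."""

    def __init__(self, dim, window=100, lr=0.1):
        self.w = np.zeros(dim)   # model parameters
        self.window = window     # active window size
        self.buf = []            # buffered (x, y) pairs in the window
        self.lr = lr

    def _grad(self, x, y):
        # Gradient of the logistic loss for a single sample.
        p = 1.0 / (1.0 + np.exp(-x @ self.w))
        return (p - y) * x

    def update(self, x, y):
        self.buf.append((x, y))
        if len(self.buf) > self.window:
            x_old, y_old = self.buf.pop(0)
            # "Unlearn" the expired sample: ascend its loss to undo
            # (approximately) the descent step it once contributed.
            self.w += self.lr * self._grad(x_old, y_old)
        # Learn the newly observed sample: standard descent step.
        self.w -= self.lr * self._grad(x, y)

    def predict(self, x):
        return int(x @ self.w > 0)
```

Under sudden drift, the unlearning step actively pushes the model away from the obsolete concept as its samples expire, so adaptation costs one ascent plus one descent step per stream element rather than a full window retrain.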
Problem

Research questions and friction points this paper is trying to address.

concept drift
continual learning
machine unlearning
sliding window
nonstationary data
Innovation

Methods, ideas, or system contributions that make the work stand out.

machine unlearning
concept drift
continual learning
sliding window
task-free learning