Analytic Drift Resister for Non-Exemplar Continual Graph Learning

📅 2026-04-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the feature drift caused by retaining only class prototypes and the limited plasticity of analytic continual learning models in exemplar-free continual graph learning. To overcome these limitations, the authors propose the Analytic Drift Resister (ADR) framework, which enhances model plasticity by iteratively applying backpropagation rather than keeping the pretrained model frozen, as analytic approaches typically do. ADR further integrates hierarchical analytic consolidation with an analytically reconstructed classifier to fully suppress feature drift and achieve theoretically zero forgetting. As the first method to attain zero forgetting in exemplar-free continual graph learning, ADR matches or surpasses state-of-the-art performance on four node classification benchmarks, striking a balance between stability against forgetting and adaptability to new tasks.
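The summary does not spell out how "hierarchical analytic consolidation" (HAM) merges layers, only that linear transformations are merged layer-wise via ridge regression. As a rough illustrative sketch under that assumption, merging two task-specific linear maps into one could mean solving a single ridge problem whose solution best reproduces each map's outputs on its own task features, using only accumulated Gram statistics (no raw examples). All variable names here are hypothetical, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
d, h, lam = 8, 6, 1e-3  # feature dim, hidden dim, ridge strength

# Two task-specific linear layers and per-task feature batches (toy data).
W1, W2 = rng.normal(size=(d, h)), rng.normal(size=(d, h))
X1, X2 = rng.normal(size=(40, d)), rng.normal(size=(40, d))

# Ridge-regression merge: find one W whose outputs best match each
# layer's outputs on its own task features. Only Gram matrices and
# cross-correlations are needed, so no raw graph data is retained.
A = X1.T @ X1 + X2.T @ X2 + lam * np.eye(d)
B = X1.T @ (X1 @ W1) + X2.T @ (X2 @ W2)
W_merged = np.linalg.solve(A, B)
```

Because the merge is the exact minimizer of a convex quadratic objective, `W_merged` achieves a combined reconstruction error no worse than either original layer; the actual HAM formulation in the paper may differ.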
📝 Abstract
Non-Exemplar Continual Graph Learning (NECGL) seeks to eliminate the privacy risks intrinsic to rehearsal-based paradigms by retaining solely class-level prototype representations rather than raw graph examples for mitigating catastrophic forgetting. However, this design choice inevitably precipitates feature drift. As a nascent alternative, Analytic Continual Learning (ACL) capitalizes on the intrinsic generalization properties of frozen pre-trained models to bolster continual learning performance. Nonetheless, a key drawback resides in the pronounced attenuation of model plasticity. To surmount these challenges, we propose Analytic Drift Resister (ADR), a novel and theoretically grounded NECGL framework. ADR exploits iterative backpropagation to break free from the frozen pre-trained constraint, adapting to evolving task graph distributions and fortifying model plasticity. Since parameter updates trigger feature drift, we further propose Hierarchical Analytic Merging (HAM), performing layer-wise merging of linear transformations in Graph Neural Networks (GNNs) via ridge regression, thereby ensuring absolute resistance to feature drift. On this basis, Analytic Classifier Reconstruction (ACR) enables theoretically zero-forgetting class-incremental learning. Empirical evaluation on four node classification benchmarks demonstrates that ADR maintains strong competitiveness against existing state-of-the-art methods.
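The abstract's "theoretically zero-forgetting" claim for the analytically reconstructed classifier rests on a standard property of ridge regression: the closed-form solution can be built incrementally from per-task sufficient statistics and is identical to the solution computed jointly over all tasks. The sketch below illustrates that generic principle only (the paper's exact ACR formulation is not given here), with hypothetical variable names:

```python
import numpy as np

rng = np.random.default_rng(0)
d, c, lam = 16, 4, 1e-2  # feature dim, num classes, ridge strength

# Two "tasks": feature/label batches arriving sequentially (toy data).
tasks = [(rng.normal(size=(50, d)), rng.normal(size=(50, c))) for _ in range(2)]

# Incremental analytic update: accumulate sufficient statistics only,
# never revisiting earlier examples (exemplar-free by construction).
A = lam * np.eye(d)   # regularized Gram matrix
B = np.zeros((d, c))  # feature-label cross-correlation
for X, Y in tasks:
    A += X.T @ X
    B += X.T @ Y
W_inc = np.linalg.solve(A, B)

# Joint ridge regression over all data at once, for comparison.
X_all = np.vstack([X for X, _ in tasks])
Y_all = np.vstack([Y for _, Y in tasks])
W_joint = np.linalg.solve(X_all.T @ X_all + lam * np.eye(d), X_all.T @ Y_all)

print(np.allclose(W_inc, W_joint))  # prints True
```

Since the incremental and joint solutions coincide exactly, the classifier loses nothing about earlier tasks as new ones arrive, which is the sense in which forgetting is "theoretically zero"; the drift-resistance machinery (HAM) exists because this equivalence breaks when the feature extractor itself keeps changing.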
Problem

Research questions and friction points this paper is trying to address.

Non-Exemplar Continual Graph Learning
feature drift
catastrophic forgetting
model plasticity
Graph Neural Networks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Analytic Drift Resister
Non-Exemplar Continual Graph Learning
Hierarchical Analytic Merging
Feature Drift Resistance
Zero-Forgetting Learning