Instance-Prototype Affinity Learning for Non-Exemplar Continual Graph Learning

📅 2025-05-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address catastrophic forgetting caused by feature drift in non-exemplar continual graph learning (NECGL), this paper proposes an instance-prototype affinity learning framework. Methodologically, it introduces two core innovations: (1) Topology-Integrated Gaussian Prototypes (TIGP), which incorporate graph structural priors into Gaussian prototype modeling, guiding feature distributions toward high-impact nodes to improve robustness against distribution shift; and (2) Instance-Prototype Affinity Distillation (IPAD), which preserves task memory by regularizing discontinuities in instance-prototype class relationships across tasks. A Decision Boundary Perception (DBP) mechanism embedded in prototype contrastive learning further sharpens inter-class discriminability. Crucially, the approach operates without storing raw samples, thereby circumventing the privacy concerns and memory bottlenecks inherent in exemplar-based methods. Evaluation on four node classification benchmarks demonstrates consistent gains over state-of-the-art methods and a better trade-off between plasticity and stability in continual graph learning.
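The distillation idea summarized above can be illustrated with a minimal sketch: rather than pinning raw features of old tasks, match each instance's similarity distribution over class prototypes between the old and new encoder, which tolerates benign feature drift. This is an illustrative NumPy approximation, not the paper's exact IPAD loss; the temperature `tau` and the KL direction are assumptions.

```python
import numpy as np

def affinity_distillation_loss(feats_new, feats_old, prototypes, tau=0.1):
    """Illustrative instance-prototype affinity distillation (assumed form).

    feats_new:  (N, D) embeddings from the current model
    feats_old:  (N, D) embeddings from the frozen previous model
    prototypes: (C, D) class prototypes
    Returns the mean KL divergence between the old and new softmax
    affinity distributions over prototypes.
    """
    def l2_normalize(x):
        return x / np.linalg.norm(x, axis=1, keepdims=True)

    def softmax(z):
        z = z - z.max(axis=1, keepdims=True)  # numerical stability
        e = np.exp(z)
        return e / e.sum(axis=1, keepdims=True)

    # Cosine similarity of each instance to each prototype (N x C)
    sim_new = l2_normalize(feats_new) @ l2_normalize(prototypes).T
    sim_old = l2_normalize(feats_old) @ l2_normalize(prototypes).T

    p_old = softmax(sim_old / tau)
    p_new = softmax(sim_new / tau)
    # KL(p_old || p_new), averaged over instances: penalize the new model
    # for changing how instances relate to the class prototypes
    return float(np.mean(np.sum(p_old * (np.log(p_old) - np.log(p_new)), axis=1)))
```

The loss is zero when old and new affinities agree and grows as the new encoder redistributes instances across class prototypes, which is the "discontinuity in class relationships" the abstract refers to.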

📝 Abstract
Graph Neural Networks (GNN) endure catastrophic forgetting, undermining their capacity to preserve previously acquired knowledge amid the assimilation of novel information. Rehearsal-based techniques revisit historical examples, adopted as a principal strategy to alleviate this phenomenon. However, memory explosion and privacy infringements impose significant constraints on their utility. Non-Exemplar methods circumvent the prior issues through Prototype Replay (PR), yet feature drift presents new challenges. In this paper, our empirical findings reveal that Prototype Contrastive Learning (PCL) exhibits less pronounced drift than conventional PR. Drawing upon PCL, we propose Instance-Prototype Affinity Learning (IPAL), a novel paradigm for Non-Exemplar Continual Graph Learning (NECGL). Exploiting graph structural information, we formulate Topology-Integrated Gaussian Prototypes (TIGP), guiding feature distributions towards high-impact nodes to augment the model's capacity for assimilating new knowledge. Instance-Prototype Affinity Distillation (IPAD) safeguards task memory by regularizing discontinuities in class relationships. Moreover, we embed a Decision Boundary Perception (DBP) mechanism within PCL, fostering greater inter-class discriminability. Evaluations on four node classification benchmark datasets demonstrate that our method outperforms existing state-of-the-art methods, achieving a better trade-off between plasticity and stability.
Problem

Research questions and friction points this paper is trying to address.

Address catastrophic forgetting in Graph Neural Networks
Overcome memory and privacy issues in rehearsal-based methods
Mitigate feature drift in Non-Exemplar Continual Graph Learning
Innovation

Methods, ideas, or system contributions that make the work stand out.

Instance-Prototype Affinity Learning for continual graph learning
Topology-Integrated Gaussian Prototypes guide feature distributions
Decision Boundary Perception enhances inter-class discriminability
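The first innovation bullet — topology-integrated prototypes that pull feature distributions toward high-impact nodes — can be sketched as a structurally weighted class mean. This is a simplified stand-in, not the paper's TIGP formulation: node degree is used here as an assumed proxy for structural importance, and the Gaussian modeling is omitted.

```python
import numpy as np

def topology_weighted_prototypes(embeddings, labels, adjacency, num_classes):
    """Compute class prototypes as degree-weighted means of node embeddings
    (illustrative stand-in for topology-integrated prototypes).

    embeddings: (N, D) node embeddings
    labels:     (N,) integer class labels
    adjacency:  (N, N) adjacency matrix of the graph
    """
    degree = adjacency.sum(axis=1)  # simple structural-importance proxy
    protos = np.zeros((num_classes, embeddings.shape[1]))
    for c in range(num_classes):
        mask = labels == c
        w = degree[mask]
        # Fall back to a uniform mean if all nodes in the class are isolated
        w = w / w.sum() if w.sum() > 0 else np.full(w.shape, 1.0 / len(w))
        # Well-connected nodes pull the prototype toward themselves
        protos[c] = (w[:, None] * embeddings[mask]).sum(axis=0)
    return protos
```

Compared with a plain per-class mean, the weighting biases each prototype toward structurally central nodes, which is the intuition behind "guiding feature distributions towards high-impact nodes."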
Lei Song
Jiangsu Provincial Joint International Research Laboratory of Medical Information Processing, School of Computer Science and Engineering, Southeast University
Jiaxing Li
Jiangsu Provincial Joint International Research Laboratory of Medical Information Processing, School of Computer Science and Engineering, Southeast University
Shihan Guan
Ph.D. student, Southeast University & University of Rennes
Deep learning, Graph Neural Networks, Epilepsy
Youyong Kong
Associate Professor at School of Computer Science and Engineering, Southeast University
Medical image processing, machine learning, brain network analysis