Online Continual Graph Learning

📅 2025-08-05
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the lack of formal definition and systematic investigation into online continual learning for dynamic graph data streams, this paper introduces the first formal characterization of “online continual graph learning,” emphasizing topology-aware batch efficiency and real-time prediction capability. We propose a unified framework that tightly integrates graph neural networks with continual learning mechanisms, enabling incremental task learning under non-i.i.d. graph stream distributions while mitigating catastrophic forgetting. Furthermore, we construct the first standardized benchmark—comprising diverse real-world graph streams and a unified evaluation protocol—to rigorously assess mainstream continual learning methods in graph streaming settings. This work bridges critical gaps in theoretical modeling, algorithmic design, and empirical evaluation, establishing foundational principles for sustainable online graph learning.

📝 Abstract
The aim of Continual Learning (CL) is to learn new tasks incrementally while avoiding catastrophic forgetting. Online Continual Learning (OCL) specifically focuses on learning efficiently from a continuous stream of data with a shifting distribution. While recent studies explore Continual Learning on graphs using Graph Neural Networks (GNNs), only a few of them focus on a streaming setting. Yet, many real-world graphs evolve over time, often requiring timely and online predictions. Current approaches, however, are not well aligned with the standard OCL setting, partly due to the lack of a clear definition of online Continual Learning on graphs. In this work, we propose a general formulation for online Continual Learning on graphs, emphasizing the efficiency requirements of batch processing over the graph topology, and providing a well-defined setting for systematic model evaluation. Finally, we introduce a set of benchmarks and report the performance of several methods from the CL literature, adapted to our setting.
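The streaming evaluation the abstract describes is commonly realized as a predict-then-train loop: each incoming batch is first used for real-time prediction, and only afterwards for a single training update. The sketch below illustrates that protocol with a trivial majority-class stand-in for a GNN; all names (`MajorityClassModel`, `online_accuracy`) are illustrative, not from the paper.

```python
# Sketch of the predict-then-train loop typical of online continual
# learning: every incoming batch is evaluated BEFORE it is used for a
# single-pass training update, so accuracy reflects real-time prediction.
from collections import Counter

class MajorityClassModel:
    """Toy stand-in for a GNN: always predicts the most frequent label seen."""
    def __init__(self):
        self.counts = Counter()

    def predict(self, batch):
        if not self.counts:
            return [0] * len(batch)
        majority = self.counts.most_common(1)[0][0]
        return [majority] * len(batch)

    def update(self, batch, labels):
        self.counts.update(labels)

def online_accuracy(model, stream):
    """Run the predict-then-train loop over a stream of (features, labels)
    batches and return streaming accuracy."""
    correct = total = 0
    for batch, labels in stream:
        preds = model.predict(batch)  # predict before labels are revealed
        correct += sum(p == y for p, y in zip(preds, labels))
        total += len(labels)
        model.update(batch, labels)   # one-pass update: batches are not revisited
    return correct / total

stream = [([[0.1], [0.2]], [1, 1]), ([[0.3], [0.4]], [1, 0])]
acc = online_accuracy(MajorityClassModel(), stream)
```

In a graph setting the batches would additionally carry topology (e.g. the edges connecting new nodes to earlier ones), which is where the paper's batch-efficiency requirements come in.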
Problem

Research questions and friction points this paper is trying to address.

Defining online continual learning for evolving graphs
Addressing catastrophic forgetting in graph neural networks
Evaluating models in streaming graph learning scenarios
Innovation

Methods, ideas, or system contributions that make the work stand out.

Proposes online Continual Learning for evolving graphs
Emphasizes efficient batch processing on graph topology
Introduces benchmarks for systematic model evaluation
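Many of the CL methods the paper adapts to its setting are replay-based, keeping a small memory of past examples to mitigate forgetting. A standard ingredient is a fixed-size buffer filled by reservoir sampling, sketched below; this is a generic illustration of the mechanism, not the paper's implementation.

```python
# Minimal reservoir-sampling replay buffer, a common building block of
# replay-based continual learning baselines (illustrative sketch).
import random

class ReservoirBuffer:
    """Keeps a uniform sample of all items seen so far, in O(capacity) memory."""
    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.items = []
        self.seen = 0
        self.rng = random.Random(seed)

    def add(self, item):
        self.seen += 1
        if len(self.items) < self.capacity:
            self.items.append(item)
        else:
            # Replace a stored item with probability capacity / seen.
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.items[j] = item

    def sample(self, k):
        """Draw up to k stored items to mix into the current training batch."""
        return self.rng.sample(self.items, min(k, len(self.items)))

buf = ReservoirBuffer(capacity=3)
for x in range(100):
    buf.add(x)
replayed = buf.sample(2)
```

During streaming training, each update step would combine the incoming batch with a few replayed items from the buffer, so gradients keep covering earlier parts of the distribution.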