TGB-Seq Benchmark: Challenging Temporal GNNs with Complex Sequential Dynamics

📅 2025-02-05
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing temporal graph neural networks (GNNs) and benchmarks overemphasize repeated edge prediction, neglecting dynamic user behavior sequences (e.g., “follow OpenAI → Anthropic → Meta AI”) and thereby failing to capture real-world temporal logic. To address this, we introduce TGB-Seq—the first temporal graph benchmark explicitly designed to evaluate complex sequential dynamics. Our work systematically shows that mainstream temporal GNNs cannot learn even fundamental sequence patterns; establishes a data construction paradigm prioritizing low edge repetition and high sequentiality; encompasses six large-scale, real-world dynamic graphs spanning diverse domains; and proposes a sequence-sensitive evaluation protocol with standardized training and testing frameworks. Experiments show that state-of-the-art models—including GraphMixer and DyGFormer—suffer substantial performance degradation and sharply increased training costs on TGB-Seq, confirming the benchmark's difficulty. All datasets, code, and an online leaderboard are publicly released.
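The curation criterion above hinges on how often a temporal edge list re-uses the same (source, destination) pair. A minimal sketch of that statistic, assuming a simple list of `(src, dst, timestamp)` tuples (the exact TGB-Seq curation pipeline is not specified here):

```python
def repeat_ratio(edges):
    """Fraction of temporal edges whose (src, dst) pair already appeared
    at an earlier timestamp. Datasets curated for TGB-Seq-style sequential
    dynamics would aim for a low value, so models must generalize to
    unseen edges instead of memorizing repeats."""
    seen = set()
    repeats = 0
    for src, dst, _t in sorted(edges, key=lambda e: e[2]):
        if (src, dst) in seen:
            repeats += 1
        else:
            seen.add((src, dst))
    return repeats / len(edges) if edges else 0.0

# Toy example mirroring the "follow" sequence: only one edge repeats.
edges = [
    ("u1", "openai", 1),
    ("u1", "anthropic", 2),
    ("u1", "openai", 3),   # repeated (src, dst) pair
    ("u1", "meta_ai", 4),
]
print(repeat_ratio(edges))  # 0.25
```

A benchmark dominated by repeated edges (ratio near 1) rewards memorization; driving this ratio down is what forces models to learn sequential structure.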

📝 Abstract
Future link prediction is a fundamental challenge in various real-world dynamic systems. To address this, numerous temporal graph neural networks (temporal GNNs) and benchmark datasets have been developed. However, these datasets often feature excessive repeated edges and lack complex sequential dynamics, a key characteristic inherent in many real-world applications such as recommender systems and "Who-To-Follow" on social networks. This oversight has led existing methods to inadvertently downplay the importance of learning sequential dynamics, focusing primarily on predicting repeated edges. In this study, we demonstrate that existing methods, such as GraphMixer and DyGFormer, are inherently incapable of learning simple sequential dynamics, such as "a user who has followed OpenAI and Anthropic is more likely to follow AI at Meta next." Motivated by this issue, we introduce the Temporal Graph Benchmark with Sequential Dynamics (TGB-Seq), a new benchmark carefully curated to minimize repeated edges, challenging models to learn sequential dynamics and generalize to unseen edges. TGB-Seq comprises large real-world datasets spanning diverse domains, including e-commerce interactions, movie ratings, business reviews, social networks, citation networks and web link networks. Benchmarking experiments reveal that current methods usually suffer significant performance degradation and incur substantial training costs on TGB-Seq, posing new challenges and opportunities for future research. TGB-Seq datasets, leaderboards, and example codes are available at https://tgb-seq.github.io/.
Problem

Research questions and friction points this paper is trying to address.

Future link prediction in real-world dynamic systems
Existing benchmark datasets overemphasize repeated edges and lack sequential dynamics
Current temporal GNNs fail to learn even simple sequential patterns
Innovation

Methods, ideas, or system contributions that make the work stand out.

Introduces TGB-Seq, a benchmark of six large real-world dynamic graphs across diverse domains
Curates data to minimize repeated edges, forcing generalization to unseen edges
Provides a sequence-sensitive evaluation protocol, standardized code, and a public leaderboard
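The evaluation side of such a benchmark typically ranks each held-out destination against sampled negative destinations and reports Mean Reciprocal Rank (MRR), as in the related TGB benchmarks. The sketch below is illustrative only; the scores and the negative-sampling scheme are hypothetical, and the exact TGB-Seq protocol may differ:

```python
def mean_reciprocal_rank(pos_scores, neg_score_lists):
    """For each positive edge, rank its model score against the scores of
    sampled negative destinations (higher score = more likely link),
    then average the reciprocal ranks. Ties count against the positive."""
    total = 0.0
    for pos, negs in zip(pos_scores, neg_score_lists):
        rank = 1 + sum(1 for n in negs if n >= pos)
        total += 1.0 / rank
    return total / len(pos_scores)

# Hypothetical scores from some temporal GNN: one positive edge vs.
# four sampled negatives each.
pos_scores = [0.9, 0.4]
neg_score_lists = [
    [0.1, 0.95, 0.3, 0.5],  # one negative outranks the positive -> rank 2
    [0.6, 0.7, 0.1, 0.2],   # two negatives outrank the positive -> rank 3
]
print(mean_reciprocal_rank(pos_scores, neg_score_lists))  # ≈ 0.417
```

Because repeated edges are minimized, a model cannot inflate this metric by re-scoring previously seen pairs; it must rank genuinely unseen destinations correctly.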