Simple and Efficient Heterogeneous Temporal Graph Neural Network

📅 2025-10-21
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Existing heterogeneous temporal graph neural networks (HTGNNs) commonly adopt a decoupled spatiotemporal modeling paradigm, resulting in weak spatiotemporal interaction and high model complexity. To address this, we propose SE-HTGNN—a lightweight, end-to-end framework that tightly integrates spatial and temporal information. First, we design a history-guided dynamic attention mechanism that explicitly encodes temporal dependencies within spatial message passing. Second, we incorporate large language models (LLMs) via prompt learning to extract semantic priors of node types, thereby enhancing structural awareness. SE-HTGNN achieves state-of-the-art prediction accuracy while accelerating inference by up to 10× and significantly reducing computational overhead. Its core innovations lie in (i) internalizing temporal modeling into the spatial learning process, and (ii) the first systematic integration of LLM-derived semantic priors into heterogeneous temporal graph representation learning.

Technology Category

Application Category

📝 Abstract
Heterogeneous temporal graphs (HTGs) are ubiquitous data structures in the real world. Recently, to enhance representation learning on HTGs, numerous attention-based neural networks have been proposed. Despite these successes, existing methods rely on a decoupled temporal and spatial learning paradigm, which weakens interactions of spatio-temporal information and leads to a high model complexity. To bridge this gap, we propose a novel learning paradigm for HTGs called Simple and Efficient Heterogeneous Temporal Graph N}eural Network (SE-HTGNN). Specifically, we innovatively integrate temporal modeling into spatial learning via a novel dynamic attention mechanism, which retains attention information from historical graph snapshots to guide subsequent attention computation, thereby improving the overall discriminative representations learning of HTGs. Additionally, to comprehensively and adaptively understand HTGs, we leverage large language models to prompt SE-HTGNN, enabling the model to capture the implicit properties of node types as prior knowledge. Extensive experiments demonstrate that SE-HTGNN achieves up to 10x speed-up over the state-of-the-art and latest baseline while maintaining the best forecasting accuracy.
Problem

Research questions and friction points this paper is trying to address.

Addresses decoupled spatio-temporal learning in heterogeneous graphs
Reduces model complexity while improving discriminative representation
Integrates temporal modeling with spatial learning via dynamic attention
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integrated temporal modeling via dynamic attention mechanism
Leveraged large language models for adaptive node understanding
Achieved significant speed-up while maintaining forecasting accuracy
🔎 Similar Papers
2024-05-07IEEE Transactions on Neural Networks and Learning SystemsCitations: 2
Yili Wang
Yili Wang
Jilin University
Graph Neural Networks
Tairan Huang
Tairan Huang
Beihang University
cv
C
Changlong He
Central South University
Q
Qiutong Li
Central South University
J
Jianliang Gao
Central South University