SHAKE-GNN: Scalable Hierarchical Kirchhoff-Forest Graph Neural Network

📅 2025-09-26
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing graph neural networks (GNNs) struggle with poor scalability and high computational cost on large-scale graph-level learning tasks. This paper proposes SHAKE-GNN, a scalable framework featuring (i) a randomized multi-resolution graph decomposition grounded in Kirchhoff forests, enabling hierarchical, data-adaptive graph coarsening, and (ii) a lightweight hierarchical GNN architecture coupled with a data-driven parameter-selection strategy, allowing flexible trade-offs between model capacity and computational efficiency. Theoretical analysis establishes linear time complexity. On multiple large-scale graph classification benchmarks, SHAKE-GNN matches or exceeds state-of-the-art baselines while reducing memory consumption by 41% and accelerating inference by 2.3× on average, delivering strong scalability and high accuracy at significantly lower resource overhead.

📝 Abstract
Graph Neural Networks (GNNs) have achieved remarkable success across a range of learning tasks. However, scaling GNNs to large graphs remains a significant challenge, especially for graph-level tasks. In this work, we introduce SHAKE-GNN, a novel scalable graph-level GNN framework based on a hierarchy of Kirchhoff Forests, a class of random spanning forests used to construct stochastic multi-resolution decompositions of graphs. SHAKE-GNN produces multi-scale representations, enabling flexible trade-offs between efficiency and performance. We introduce an improved, data-driven strategy for selecting the trade-off parameter and analyse the time-complexity of SHAKE-GNN. Experimental results on multiple large-scale graph classification benchmarks demonstrate that SHAKE-GNN achieves competitive performance while offering improved scalability.
Problem

Research questions and friction points this paper is trying to address.

Scaling graph neural networks to large graphs
Addressing scalability challenges in graph-level tasks
Enabling flexible efficiency-performance trade-offs in GNNs
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses Kirchhoff Forests for multi-resolution graph decompositions
Generates multi-scale representations for efficiency-performance trade-offs
Implements data-driven parameter selection for improved scalability
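The abstract does not spell out the forest-sampling routine, so the sketch below shows the standard Kirchhoff-forest construction that the method builds on: a Wilson-style loop-erased random walk in which a node at each step is absorbed (becomes a tree root) with probability q / (q + deg), so larger q yields more, smaller trees and a finer resolution. The function names and plain-dict graph representation are illustrative assumptions, not the authors' code.

```python
import random
from collections import defaultdict

def sample_kirchhoff_forest(adj, q, seed=0):
    """Sample a random spanning forest via Wilson-style loop-erased walks.
    At node v the walk is absorbed (v becomes a root) with probability
    q / (q + deg(v)); otherwise it steps to a uniform random neighbour.
    Overwriting the successor map erases loops.  Returns node -> tree root."""
    rng = random.Random(seed)
    root_of = {}
    for start in adj:
        if start in root_of:
            continue
        succ = {}                      # loop-erased successors of this walk
        v = start
        while v not in root_of:
            if rng.random() < q / (q + len(adj[v])):
                root_of[v] = v         # absorbed: v becomes a new root
                break
            u = rng.choice(adj[v])
            succ[v] = u                # revisiting v overwrites => loop erasure
            v = u
        r = root_of[v]
        v = start                      # attach the loop-erased path to root r
        while v not in root_of:
            root_of[v] = r
            v = succ[v]
    return root_of

def coarsen(adj, root_of):
    """Contract each tree of the forest into one supernode; edges between
    distinct trees become the edges of the coarser graph."""
    coarse = defaultdict(set)
    for v, nbrs in adj.items():
        for u in nbrs:
            if root_of[v] != root_of[u]:
                coarse[root_of[v]].add(root_of[u])
    return {r: sorted(ns) for r, ns in coarse.items()}
```

Applying `coarsen` repeatedly with decreasing q would produce the kind of multi-resolution hierarchy the paper describes, with q acting as the efficiency/performance trade-off parameter.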