Renormalized Graph Neural Networks

📅 2023-06-01
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
📄 PDF
🤖 AI Summary
To address the expressive limitations and scale sensitivity of Graph Neural Networks (GNNs) in modeling complex systems, this paper introduces Renormalization Group (RG) theory into GNN design for the first time, proposing an RG-driven differentiable graph coarsening framework. The method enables scale-adaptive message passing via Laplacian spectral-domain graph rewiring, relaxing the conventional fixed-topology assumption and enhancing scale invariance and generalization. Key contributions include: (i) the first formal theoretical linkage between RG theory and GNNs; and (ii) an end-to-end differentiable graph rewiring paradigm. Evaluated on standard benchmarks—including Cora, Citeseer, PubMed, and OGB—the model consistently outperforms GCN, GAT, and GraphSAGE. Notably, it achieves absolute accuracy gains of 5.2–9.7% in few-shot learning and cross-scale transfer tasks, demonstrating superior robustness to structural scale variations.
📝 Abstract
Graph Neural Networks (GNNs) have become essential for studying complex data, particularly when represented as graphs. Their value is underpinned by their ability to reflect the intricacies of numerous areas, ranging from social to biological networks. GNNs can grapple with non-linear behaviors, emerging patterns, and complex connections; these are also typical characteristics of complex systems. The renormalization group (RG) theory has emerged as the language for studying complex systems. It is recognized as the preferred lens through which to study complex systems, offering a framework that can untangle their intricate dynamics. Despite the clear benefits of integrating RG theory with GNNs, no existing methods have ventured into this promising territory. This paper proposes a new approach that applies RG theory to devise a novel graph rewiring to improve GNNs' performance on graph-related tasks. We support our proposal with extensive experiments on standard benchmarks and baselines. The results demonstrate the effectiveness of our method and its potential to remedy the current limitations of GNNs. Finally, this paper marks the beginning of a new research direction. This path combines the theoretical foundations of RG, the magnifying glass of complex systems, with the structural capabilities of GNNs. By doing so, we aim to enhance the potential of GNNs in modeling and unraveling the complexities inherent in diverse systems.
Problem

Research questions and friction points this paper is trying to address.

Analyzing coarse-grained graph resolutions for node classification
Exploring Laplacian renormalization group theory's impact on graphs
Improving test accuracy with multi-scale graph representations
Innovation

Methods, ideas, or system contributions that make the work stand out.

Laplacian renormalization group theory for graph resolution
Multi-scale graph representations for node classification
Combining original and characteristic scale graphs
🔎 Similar Papers
No similar papers found.