🤖 AI Summary
Graph Neural Networks (GNNs) are inherently limited by local message passing, which hinders effective modeling of long-range dependencies; existing extensions—such as continuous-time dynamics or fully connected self-attention—often suffer from high computational cost and poor scalability. To address this, the authors propose topologic attention, a topology-aware attention mechanism that abandons explicit pairwise node attention. Instead, it models probabilistic information flow across both direct and indirect neighbors, unifying local and global relational modeling. By embedding graph structural priors into a learnable information propagation scheme, topologic attention enables efficient and scalable attention computation without sacrificing expressivity. Evaluated on multiple benchmark datasets, it achieves state-of-the-art performance in graph representation learning, outperforming prior methods. Moreover, its inference overhead is lower than that of standard attention-based baselines, indicating better efficiency and scalability.
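To make the core idea concrete, here is a minimal sketch of attention emerging from multi-hop propagation rather than pairwise scores. This is an illustrative toy, not the paper's implementation: the function name `topology_propagate` and the per-hop mixing weights are assumptions standing in for the paper's learned propagation parameters, and the paper's probabilistic (Gaussian belief propagation) machinery is not reproduced here.

```python
import numpy as np

def topology_propagate(x, edges, hop_weights):
    """Mix each node's features with its k-hop neighborhoods.

    Each hop sweeps the edge list once, so K hops cost O(K * E) --
    linear in edges, unlike dense self-attention's O(N^2) pairwise
    scores. hop_weights plays the role of learned propagation
    parameters (an assumption for this sketch).
    """
    n, _ = x.shape
    # In-degree for row-normalized (mean) aggregation over neighbors.
    deg = np.zeros(n)
    for _, dst in edges:
        deg[dst] += 1
    deg[deg == 0] = 1.0  # avoid division by zero for isolated nodes

    out = hop_weights[0] * x  # hop 0: each node keeps its own features
    h = x
    for w in hop_weights[1:]:
        nxt = np.zeros_like(h)
        for src, dst in edges:  # one O(E) sweep per hop
            nxt[dst] += h[src]
        h = nxt / deg[:, None]
        out = out + w * h  # fold in the k-hop (indirect) neighborhood
    return out

# Toy usage: a directed path 0 -> 1 -> 2 with one-hot features.
x = np.eye(3)
out = topology_propagate(x, [(0, 1), (1, 2)], hop_weights=[0.5, 0.5])
```

After one weighted hop, node 2's representation blends its own features with node 1's, without ever computing an explicit attention score between the pair.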
📝 Abstract
Graph Neural Networks rely on local message passing, which limits their ability to model long-range dependencies in graphs. Existing approaches extend this range through continuous-time dynamics or dense self-attention, but both suffer from high computational cost and limited scalability. We propose Topologic Attention Networks, a new framework built on topologic attention, a probabilistic mechanism that learns how information should flow through both direct and indirect connections in a graph. Unlike conventional attention, which depends on explicit pairwise interactions, topologic attention emerges from the learned information propagation over the graph, enabling unified reasoning about local and global relationships. This method achieves state-of-the-art performance against all measured baseline models. Our implementation is available at https://github.com/Marshall-Rosenhoover/Topologic-Attention-Networks.