Topologic Attention Networks: Attending to Direct and Indirect Neighbors through Gaussian Belief Propagation

📅 2025-11-20
📈 Citations: 0
Influential: 0
🤖 AI Summary
Graph Neural Networks (GNNs) are inherently limited by local message passing, hindering effective modeling of long-range dependencies; existing extensions—such as continuous-time dynamics or fully connected self-attention—often suffer from high computational cost and poor scalability. To address this, we propose Topo-Attention, a topology-aware attention mechanism that abandons explicit pairwise node attention. Instead, it models probabilistic information flow across both direct and indirect neighbors via Gaussian belief propagation, unifying local and global relational modeling. By embedding graph structural priors into a learnable, linear-time information propagation scheme, Topo-Attention enables efficient and scalable attention computation without sacrificing expressivity. Evaluated on multiple benchmark datasets, it achieves state-of-the-art performance in graph representation learning, significantly outperforming prior methods. Moreover, its inference overhead is substantially lower than that of standard attention-based baselines, demonstrating superior efficiency and scalability.

📝 Abstract
Graph Neural Networks rely on local message passing, which limits their ability to model long-range dependencies in graphs. Existing approaches extend this range through continuous-time dynamics or dense self-attention, but both suffer from high computational cost and limited scalability. We propose Topologic Attention Networks, a new framework built on topologic attention, a probabilistic mechanism that learns how information should flow through both direct and indirect connections in a graph. Unlike conventional attention, which depends on explicit pairwise interactions, topologic attention emerges from the learned information propagation of the graph, enabling unified reasoning over local and global relationships. This method achieves state-of-the-art performance across all measured baselines. Our implementation is available at https://github.com/Marshall-Rosenhoover/Topologic-Attention-Networks.
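The abstract does not spell out the paper's propagation equations, but the core primitive it names, Gaussian belief propagation (GaBP), has a standard scalar form that illustrates how information can flow to indirect neighbors through purely local message updates. The sketch below is that textbook scalar GaBP solver (for a Gaussian graphical model with precision matrix A and potential vector b, equivalently solving A x = b); the function name `gabp` and all details are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def gabp(A, b, iters=50):
    # Scalar Gaussian belief propagation on the graph whose edges are the
    # nonzero off-diagonal entries of the (symmetric) precision matrix A.
    # Exact on trees; on loopy graphs it converges under conditions such
    # as walk-summability. Returns marginal means (solving A x = b) and
    # marginal precisions.
    n = len(b)
    nbrs = [[j for j in range(n) if j != i and A[i, j] != 0] for i in range(n)]
    P = {(i, j): 0.0 for i in range(n) for j in nbrs[i]}   # message precisions
    mu = {(i, j): 0.0 for i in range(n) for j in nbrs[i]}  # message means
    for _ in range(iters):
        for i in range(n):
            for j in nbrs[i]:
                # Cavity belief: node i's belief excluding j's incoming message.
                P_cav = A[i, i] + sum(P[(k, i)] for k in nbrs[i] if k != j)
                m_cav = (b[i] + sum(P[(k, i)] * mu[(k, i)]
                                    for k in nbrs[i] if k != j)) / P_cav
                # Pass the cavity belief through the pairwise coupling A[i, j].
                P[(i, j)] = -A[i, j] ** 2 / P_cav
                mu[(i, j)] = P_cav * m_cav / A[i, j]
    # Combine all incoming messages into each node's marginal belief.
    prec = np.array([A[i, i] + sum(P[(k, i)] for k in nbrs[i]) for i in range(n)])
    mean = np.array([(b[i] + sum(P[(k, i)] * mu[(k, i)] for k in nbrs[i])) / prec[i]
                     for i in range(n)])
    return mean, prec

# Example: a 3-node chain graph (a tree, so GaBP is exact).
A = np.array([[2.0, -1.0, 0.0],
              [-1.0, 2.0, -1.0],
              [0.0, -1.0, 2.0]])
b = np.array([1.0, 0.0, 1.0])
mean, prec = gabp(A, b)  # mean solves A x = b, i.e. [1, 1, 1]
```

Each node only ever exchanges messages with its direct neighbors, yet after enough iterations every node's marginal reflects evidence from the entire graph, which is the sense in which belief propagation can mediate attention over indirect neighbors in linear time per iteration.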
Problem

Research questions and friction points this paper is trying to address.

Modeling long-range dependencies in graphs using local message passing
Reducing computational cost and improving scalability of graph attention
Unifying reasoning over local and global graph relationships through probabilistic attention
Innovation

Methods, ideas, or system contributions that make the work stand out.

Topologic Attention Networks use a probabilistic attention mechanism
Learns information flow through direct and indirect connections
Enables unified reasoning over local and global relationships
Marshall Rosenhoover
Department of Computer Science, University of Alabama in Huntsville
Huaming Zhang
University of Alabama in Huntsville