🤖 AI Summary
MPNNs suffer degraded node classification performance when neighborhood class distributions are inconsistent, not merely when graph homophily is low; in particular, same-class nodes with multimodal neighborhood distributions cause biased message aggregation. To address this, we propose a differentiable graph rewiring method based on the Gumbel-Softmax relaxation, the first to apply Gumbel-Softmax to discrete graph structure optimization for end-to-end learning of neighborhood distribution consistency. The approach mitigates over-smoothing and over-squashing, enhances long-range dependency modeling, and improves both neighborhood informativeness and graph connectivity. Extensive experiments on multiple benchmark datasets demonstrate significant gains in MPNN classification accuracy. Moreover, the learned graph structures yield more discriminative and structurally sound neighborhoods, as validated by quantitative metrics of neighborhood class distribution coherence and topological rationality.
📝 Abstract
Graph homophily has been considered an essential property for message-passing neural networks (MPNNs) in node classification. Recent findings suggest that performance is more closely tied to the consistency of neighborhood class distributions. We demonstrate that MPNN performance depends on the number of components in the overall neighborhood distribution within a class. Breaking classes down into their neighborhood distribution components increases measures of neighborhood distribution informativeness but does not improve MPNN performance. We therefore propose a Gumbel-Softmax-based rewiring method that reduces deviations in neighborhood distributions. Our results show that the new method enhances neighborhood informativeness, handles long-range dependencies, mitigates over-squashing, and increases MPNN classification performance. The code is available at https://github.com/Bobowner/Gumbel-Softmax-MPNN.
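The rewiring procedure itself is detailed in the paper, but the Gumbel-Softmax relaxation it builds on can be sketched in a few lines. The NumPy snippet below is a minimal illustration of sampling a (soft) discrete choice, here framed as picking one of several candidate edges, in a way that stays differentiable with respect to the logits; the `gumbel_softmax` function and the edge logits are hypothetical and not the authors' implementation:

```python
import numpy as np

def gumbel_softmax(logits, tau=1.0, rng=None):
    """Differentiable (soft) sample from a categorical distribution over
    `logits` via the Gumbel-Softmax relaxation. The result is a probability
    vector that approaches a one-hot sample as tau -> 0."""
    rng = np.random.default_rng() if rng is None else rng
    u = rng.uniform(1e-9, 1.0, size=logits.shape)
    gumbel = -np.log(-np.log(u))        # Gumbel(0, 1) noise
    y = (logits + gumbel) / tau
    y = y - y.max()                     # shift for numerical stability
    e = np.exp(y)
    return e / e.sum()

# Hypothetical example: choose among 4 candidate edges for one node.
edge_logits = np.array([2.0, 0.5, -1.0, 0.0])
soft = gumbel_softmax(edge_logits, tau=0.5, rng=np.random.default_rng(0))
hard = np.eye(len(edge_logits))[soft.argmax()]  # straight-through hard sample
```

In practice, a rewiring layer of this kind would use the soft sample (or the hard sample with a straight-through gradient) as an edge indicator, letting edge selection be trained end-to-end with the MPNN.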