Mixture of Decoupled Message Passing Experts with Entropy Constraint for General Node Classification

📅 2025-02-12
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Real-world graphs exhibit both homophily and heterophily, limiting the generalization capability of existing GNNs for node classification. To address this, we propose GNNMoE—a generic node classification framework based on Mixture of Experts (MoE). GNNMoE decouples message passing from feature transformation to construct diverse, specialized experts. It introduces an entropy-constrained dual-gating mechanism that jointly employs soft weighting and hard Top-K selection for node-adaptive expert assignment. Additionally, entropy-regularized gating sharpening is incorporated to enhance decision robustness. This work is the first to integrate decoupled message passing with the MoE architecture. Extensive experiments on diverse homophilic and heterophilic graph benchmarks demonstrate that GNNMoE consistently outperforms state-of-the-art GNNs, heterophilic GNNs, and graph Transformers—achieving both superior accuracy and strong generalization across heterogeneous graph structures.

📝 Abstract
The varying degrees of homophily and heterophily in real-world graphs persistently constrain the universality of graph neural networks (GNNs) for node classification. Adopting a data-centric perspective, this work reveals an inherent preference of different graphs towards distinct message encoding schemes: homophilous graphs favor local propagation, while heterophilous graphs prefer flexible combinations of propagation and transformation. To address this, we propose GNNMoE, a universal node classification framework based on the Mixture-of-Experts (MoE) mechanism. The framework first constructs diverse message-passing experts by recombining fine-grained encoding operators, then designs soft and hard gating layers to allocate the most suitable expert networks for each node's representation learning, thereby enhancing both model expressiveness and adaptability to diverse graphs. Furthermore, because soft gating may introduce encoding noise in homophilous scenarios, we introduce an entropy constraint that sharpens the soft gates, yielding a seamless integration of weighted combination and Top-K selection. Extensive experiments demonstrate that GNNMoE significantly outperforms mainstream GNNs, heterophilous GNNs, and graph transformers in both node classification performance and universality across diverse graph datasets.
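The dual-gating idea in the abstract (soft weighting combined with hard Top-K selection, plus an entropy term that encourages sharper gates) can be illustrated with a minimal numpy sketch. This is an assumption-laden illustration, not the paper's implementation: the function name `entropy_constrained_dual_gate`, the tensor shapes, and the exact way the entropy penalty is returned are all hypothetical.

```python
import numpy as np

def softmax(z):
    # numerically stable softmax over the last axis
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def entropy_constrained_dual_gate(scores, expert_outputs, k=2):
    """Hypothetical sketch of dual gating: soft gate weights per node,
    hard Top-K expert selection, and a gate-entropy term that a training
    loss could penalize to sharpen the soft gates (per the abstract).

    scores:         (num_nodes, num_experts) raw gating logits
    expert_outputs: (num_nodes, num_experts, dim) per-expert node embeddings
    """
    w = softmax(scores)                           # soft gating weights
    topk = np.argsort(w, axis=-1)[:, -k:]         # indices of the k largest weights
    mask = np.zeros_like(w)
    np.put_along_axis(mask, topk, 1.0, axis=-1)   # hard Top-K mask
    w_hard = w * mask
    w_hard /= w_hard.sum(axis=-1, keepdims=True)  # renormalize over selected experts
    # weighted combination of the selected experts' outputs, per node
    mixed = np.einsum('ne,ned->nd', w_hard, expert_outputs)
    # mean gate entropy; adding this to the loss pushes gates toward sharpness
    gate_entropy = -(w * np.log(w + 1e-12)).sum(axis=-1).mean()
    return mixed, gate_entropy
```

In this sketch the soft weights decide *how much* each selected expert contributes, the Top-K mask decides *which* experts contribute at all, and the entropy term supplies the regularization signal that the abstract describes as gate sharpening.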
Problem

Research questions and friction points this paper is trying to address.

Addresses node classification on graphs with varying homophily and heterophily
Enhances GNN adaptability via a Mixture-of-Experts architecture
Introduces an entropy constraint to sharpen expert gating
Innovation

Methods, ideas, or system contributions that make the work stand out.

A Mixture-of-Experts mechanism built from decoupled message-passing experts
Soft and hard gating layers for per-node expert allocation
An entropy constraint that sharpens soft gating toward Top-K selection