CO-EVOLVE: Bidirectional Co-Evolution of Graph Structure and Semantics for Heterophilous Learning

📅 2026-03-19
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This work addresses the semantic-structural inconsistency and error propagation arising from static unidirectional fusion between large language models (LLMs) and graph neural networks (GNNs) on heterophilous graphs. To this end, we propose a bidirectional co-evolution framework that models graph structure and semantic embeddings as dynamically mutually reinforcing latent variables. Through Gauss–Seidel alternating optimization, the framework establishes cyclic feedback: the GNN guides the LLM via soft prompts, while the LLM constructs a dynamic semantic graph to reconstruct the GNN's input. We introduce a novel conflict-aware contrastive loss to align high-order topological boundaries, design an adaptive node gating mechanism to fuse static and learnable structures, and propose an uncertainty-gated consistency strategy for metacognitive alignment. Experiments demonstrate that our method achieves an average accuracy gain of 9.07% and a 7.19% improvement in F1 score on public benchmarks.
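The Gauss–Seidel alternating scheme described above can be illustrated with a toy sketch: two latent variables (one standing in for structure, one for semantics) are updated in turn, with each update consuming the *freshly updated* value of the other rather than a stale copy. The quadratic objective and coupling matrices below are hypothetical stand-ins for the paper's structure/semantics losses, not its actual implementation.

```python
import numpy as np

def co_evolve_step(x, z, A, B, lr=0.05):
    """One Gauss-Seidel sweep over a toy joint objective
    f(x, z) = ||A x - z||^2 + ||B z - x||^2, where x plays the role of
    the structure view and z the semantic view. A and B are hypothetical
    coupling matrices standing in for the cross-view losses."""
    # Structure-view update (GNN role): gradient step on x with z held fixed.
    grad_x = 2 * A.T @ (A @ x - z) - 2 * (B @ z - x)
    x = x - lr * grad_x
    # Semantic-view update (LLM role): uses the *new* x (Gauss-Seidel),
    # whereas a Jacobi-style scheme would reuse the stale x.
    grad_z = -2 * (A @ x - z) + 2 * B.T @ (B @ z - x)
    z = z - lr * grad_z
    return x, z
```

Iterating `co_evolve_step` drives the joint objective down; the same skeleton would hold if the gradient steps were replaced by full GNN and LLM training epochs.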

๐Ÿ“ Abstract
The integration of Large Language Models (LLMs) and Graph Neural Networks (GNNs) promises to unify semantic understanding with structural reasoning, yet existing methods typically rely on static, unidirectional pipelines. These approaches suffer from fundamental limitations: (1) Bidirectional Error Propagation, where semantic hallucinations in LLMs or structural noise in GNNs permanently poison the downstream modality without opportunity for recourse; (2) Semantic-Structural Dissonance, particularly in heterophilous settings where textual similarity contradicts topological reality; (3) a Blind Leading the Blind phenomenon, where indiscriminate alignment forces models to mirror each other's mistakes regardless of uncertainty. To address these challenges, we propose CO-EVOLVE, a dual-view co-evolution framework that treats graph topology and semantic embeddings as dynamic, mutually reinforcing latent variables. By employing a Gauss-Seidel alternating optimization strategy, our framework establishes a cyclic feedback loop: the GNN injects structural context as Soft Prompts to guide the LLM, while the LLM constructs favorable Dynamic Semantic Graphs to rewire the GNN. We introduce three key innovations to stabilize this evolution: (1) a Hard-Structure Conflict-Aware Contrastive Loss that warps the semantic manifold to respect high-order topological boundaries; (2) an Adaptive Node Gating Mechanism that dynamically fuses static and learnable structures to recover missing links; (3) an Uncertainty-Gated Consistency strategy that enables meta-cognitive alignment, ensuring models only learn from the confident view. Finally, an Entropy-Aware Adaptive Fusion integrates predictions during inference. Extensive experiments on public benchmarks demonstrate that CO-EVOLVE significantly outperforms state-of-the-art baselines, achieving average improvements of 9.07% in Accuracy and 7.19% in F1-score.
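The Entropy-Aware Adaptive Fusion mentioned at the end of the abstract can be sketched as follows: each view's class distribution is weighted by its confidence, with lower-entropy (more certain) predictions counting for more. Inverse-entropy weighting is one plausible instantiation chosen for illustration; the paper's exact fusion rule is not specified here.

```python
import numpy as np

def shannon_entropy(p, eps=1e-12):
    """Shannon entropy (in nats) of a probability vector."""
    p = np.clip(p, eps, 1.0)
    return -np.sum(p * np.log(p))

def entropy_aware_fuse(p_gnn, p_llm, eps=1e-12):
    """Fuse two class distributions for one node, weighting each view by
    the inverse of its predictive entropy, so the more confident view
    dominates. This is an illustrative scheme, not the paper's exact rule."""
    w_gnn = 1.0 / (shannon_entropy(p_gnn) + eps)
    w_llm = 1.0 / (shannon_entropy(p_llm) + eps)
    fused = (w_gnn * p_gnn + w_llm * p_llm) / (w_gnn + w_llm)
    return fused / fused.sum()  # renormalise against numerical drift
```

For example, fusing a sharp GNN prediction `[0.9, 0.05, 0.05]` with a flat LLM prediction `[0.4, 0.3, 0.3]` yields a distribution closer to the GNN's, since its entropy is lower.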
Problem

Research questions and friction points this paper is trying to address.

heterophilous graphs
semantic-structural dissonance
bidirectional error propagation
LLM-GNN integration
uncertainty-aware alignment
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bidirectional Co-Evolution
Heterophilous Graph Learning
Dynamic Semantic Graph
Uncertainty-Gated Consistency
Contrastive Loss