🤖 AI Summary
To address type-semantic loss and structural noise in heterogeneous graph neural networks (HGNNs) for modeling heterogeneous information networks (HINs), this paper proposes a type-aware graph autoencoder framework integrated with guided graph augmentation. The method jointly optimizes representation learning and graph structure refinement through two core innovations: (1) a decoder-driven dynamic graph augmentation mechanism that adaptively refines the adjacency structure based on reconstruction feedback; and (2) a schema-constrained edge reconstruction auxiliary task that preserves type-semantic consistency while suppressing spurious edges. The framework unifies a heterogeneous graph autoencoder, type-aware message passing, and a schema-aware reconstruction loss. Extensive experiments on IMDB, ACM, and DBLP demonstrate state-of-the-art performance: classification accuracy improves by 2.1–4.7% over leading HGNN baselines, while computational overhead decreases by 15–22%. The approach also shows improved robustness and generalization across backbones.
📝 Abstract
Heterogeneous Graph Neural Networks (HGNNs) are effective for modeling Heterogeneous Information Networks (HINs), which encode complex multi-typed entities and relations. However, HGNNs often suffer from type information loss and structural noise, limiting their representational fidelity and generalization. We propose THeGAU, a model-agnostic framework that combines a type-aware graph autoencoder with guided graph augmentation to improve node classification. THeGAU reconstructs schema-valid edges as an auxiliary task to preserve node-type semantics and introduces a decoder-driven augmentation mechanism to selectively refine noisy structures. This joint design enhances robustness, accuracy, and efficiency while significantly reducing computational overhead. Extensive experiments on three benchmark HIN datasets (IMDB, ACM, and DBLP) demonstrate that THeGAU consistently outperforms existing HGNN methods, achieving state-of-the-art performance across multiple backbones.
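To make the schema-constrained edge reconstruction idea concrete, here is a minimal sketch. Everything below is an illustrative assumption, not THeGAU's actual implementation: the function names (`schema_mask`, `edge_reconstruction_loss`), the inner-product sigmoid decoder (a common graph-autoencoder choice), and the toy author/paper schema are all hypothetical.

```python
import numpy as np

def schema_mask(node_types, valid_pairs):
    """Boolean mask M[i, j] = True iff an edge i->j is allowed by the schema.

    `valid_pairs` is a set of (source_type, target_type) relations; pairs
    outside the schema are excluded from reconstruction, which is how
    schema-invalid (spurious) edges are suppressed in this sketch.
    """
    n = len(node_types)
    mask = np.zeros((n, n), dtype=bool)
    for i in range(n):
        for j in range(n):
            mask[i, j] = (node_types[i], node_types[j]) in valid_pairs
    return mask

def edge_reconstruction_loss(embeddings, adjacency, node_types, valid_pairs, eps=1e-9):
    """Binary cross-entropy over schema-valid node pairs only.

    Decoder: sigmoid(Z Z^T), i.e. the edge probability is the sigmoid of the
    inner product of the two node embeddings (an assumed decoder form).
    """
    scores = 1.0 / (1.0 + np.exp(-embeddings @ embeddings.T))
    mask = schema_mask(node_types, valid_pairs)
    s = scores[mask]                      # predicted probabilities, valid pairs only
    a = adjacency[mask].astype(float)     # observed edges, valid pairs only
    return -np.mean(a * np.log(s + eps) + (1.0 - a) * np.log(1.0 - s + eps))

# Toy HIN: nodes 0 and 1 are "paper" nodes, node 2 is an "author";
# the schema only allows author -> paper edges.
types = ["paper", "paper", "author"]
valid = {("author", "paper")}
Z = np.random.default_rng(0).normal(size=(3, 4))
A = np.array([[0, 0, 0],
              [0, 0, 0],
              [1, 1, 0]])
loss = edge_reconstruction_loss(Z, A, types, valid)
```

In a full pipeline, this auxiliary loss would be added to the node-classification objective, and the decoder's edge scores could additionally drive the guided augmentation step (adding high-score schema-valid edges, pruning low-score ones).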