ABG-NAS: Adaptive Bayesian Genetic Neural Architecture Search for Graph Representation Learning

📅 2025-04-30
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing graph neural networks (GNNs) struggle to adapt to diverse and complex graph structures, limiting the robustness and generalization of the representations they learn. To address this, we propose an end-to-end automated neural architecture search (NAS) framework built on a novel two-stage cooperative search strategy: (i) coarse-grained architecture exploration via an adaptive genetic algorithm, followed by (ii) fine-grained hyperparameter refinement via Bayesian optimization. We design a comprehensive, fine-grained search space encompassing both propagation and transformation operations to explicitly model graph structural characteristics, and we integrate differentiable architecture encoding with meta-learning to improve search efficiency and cross-task generalization. Evaluated on standard benchmarks (Cora, PubMed, Citeseer, and CoraFull), our framework achieves average classification accuracy improvements of 1.8–3.2% over handcrafted GNNs and state-of-the-art NAS approaches, while accelerating the search by 2.1×.
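The two-stage pipeline lends itself to a compact sketch. The toy Python example below illustrates the idea: a genetic algorithm coarsely explores propagation/transformation (P/T) sequences, then a second stage refines a hyperparameter for the winning architecture. All names, the stand-in fitness function, and the use of random search in place of a Bayesian-optimization surrogate in stage 2 are illustrative assumptions, not the ABG-NAS implementation.

```python
import random

OPS = ["P", "T"]  # propagation / transformation ops from the search space

def random_arch(depth=4):
    return [random.choice(OPS) for _ in range(depth)]

def fitness(arch, lr):
    # Stand-in for validation accuracy; a real run would train the candidate GNN.
    balance = 1 - abs(arch.count("P") - arch.count("T")) / len(arch)
    return balance - (lr - 0.01) ** 2

def crossover(a, b):
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(arch, rate=0.2):
    return [random.choice(OPS) if random.random() < rate else op for op in arch]

def coarse_genetic_search(pop_size=20, generations=30):
    """Stage 1: coarse-grained architecture exploration with a genetic algorithm."""
    pop = [random_arch() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda a: fitness(a, lr=0.01), reverse=True)
        elite = pop[: pop_size // 2]  # keep the fitter half as parents
        pop = elite + [mutate(crossover(*random.sample(elite, 2))) for _ in elite]
    return max(pop, key=lambda a: fitness(a, lr=0.01))

def refine_hyperparameters(arch, trials=20):
    """Stage 2 stand-in: hyperparameter refinement. Random search is shown for
    brevity; the paper uses Bayesian optimization (e.g. a surrogate model that
    proposes promising learning rates instead of sampling them uniformly)."""
    best_lr, best_score = None, float("-inf")
    for _ in range(trials):
        lr = 10 ** random.uniform(-4, -1)
        score = fitness(arch, lr)
        if score > best_score:
            best_lr, best_score = lr, score
    return best_lr

arch = coarse_genetic_search()
print("best arch:", arch, "refined lr:", refine_hyperparameters(arch))
```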

📝 Abstract
Effective and efficient graph representation learning is essential for enabling critical downstream tasks, such as node classification, link prediction, and subgraph search. However, existing graph neural network (GNN) architectures often struggle to adapt to diverse and complex graph structures, limiting their ability to provide robust and generalizable representations. To address this challenge, we propose ABG-NAS, a novel framework for automated graph neural network architecture search tailored for efficient graph representation learning. ABG-NAS encompasses three key components: a Comprehensive Architecture Search Space (CASS), an Adaptive Genetic Optimization Strategy (AGOS), and a Bayesian-Guided Tuning Module (BGTM). CASS systematically explores diverse propagation (P) and transformation (T) operations, enabling the discovery of GNN architectures capable of capturing intricate graph characteristics. AGOS dynamically balances exploration and exploitation, ensuring search efficiency and preserving solution diversity. BGTM further optimizes hyperparameters periodically, enhancing the scalability and robustness of the resulting architectures. Empirical evaluations on benchmark datasets (Cora, PubMed, Citeseer, and CoraFull) demonstrate that ABG-NAS consistently outperforms both manually designed GNNs and state-of-the-art neural architecture search (NAS) methods. These results highlight the potential of ABG-NAS to advance graph representation learning by providing scalable and adaptive solutions for diverse graph structures. Our code is publicly available at https://github.com/sserranw/ABG-NAS.
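To make the P/T search space described by CASS concrete, here is a minimal PyTorch sketch of how a candidate architecture could be decoded from a propagation/transformation sequence. The module names (`PropagateOp`, `TransformOp`, `build_gnn`) and the dense, pre-normalized adjacency matrix are illustrative assumptions, not the paper's actual code.

```python
import torch
import torch.nn as nn

class PropagateOp(nn.Module):
    """'P' op: neighborhood aggregation X <- A_hat X with a normalized adjacency."""
    def forward(self, x, adj):
        return adj @ x

class TransformOp(nn.Module):
    """'T' op: nonlinear feature transformation X <- relu(X W)."""
    def __init__(self, dim):
        super().__init__()
        self.lin = nn.Linear(dim, dim)

    def forward(self, x, adj):
        return torch.relu(self.lin(x))

def build_gnn(sequence, dim):
    """Instantiate a candidate architecture from a P/T string, e.g. 'PTPT'."""
    ops = {"P": PropagateOp, "T": lambda: TransformOp(dim)}
    return nn.ModuleList([ops[s]() for s in sequence])

def forward_gnn(layers, x, adj):
    for layer in layers:
        x = layer(x, adj)
    return x

# Toy usage: 5 nodes, 8-dim features; identity stands in for a normalized adjacency.
x = torch.randn(5, 8)
adj = torch.eye(5)
out = forward_gnn(build_gnn("PTPT", 8), x, adj)
print(out.shape)  # torch.Size([5, 8])
```

Decoupling P and T into separately searchable operations is what lets the search discover, say, propagation-heavy architectures for homophilous graphs and transformation-heavy ones elsewhere.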
Problem

Research questions and friction points this paper is trying to address.

Adapting GNNs to diverse graph structures
Automating graph neural architecture search
Enhancing scalability and robustness of GNNs
Innovation

Methods, ideas, or system contributions that make the work stand out.

Comprehensive Architecture Search Space (CASS) covering propagation and transformation operations
Adaptive Genetic Optimization Strategy (AGOS) for balancing exploration and exploitation (sketched after this list)
Bayesian-Guided Tuning Module (BGTM) for periodic hyperparameter tuning and robustness
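A common way to realize the exploration/exploitation balance that AGOS targets is to adapt the genetic-operator rates to population diversity. The sketch below (hypothetical names, not the paper's code) raises the mutation rate as diversity collapses and favors crossover while the population is still varied.

```python
def diversity(population):
    """Fraction of unique P/T sequences; a crude proxy for population diversity."""
    return len({tuple(a) for a in population}) / len(population)

def adaptive_rates(population, base_mutation=0.1, base_crossover=0.8):
    """Increase mutation (exploration) as diversity collapses; lean on
    crossover-driven exploitation while the population stays diverse."""
    d = diversity(population)
    mutation = min(1.0, base_mutation + (1.0 - d) * 0.3)
    crossover = max(0.5, base_crossover - (1.0 - d) * 0.2)
    return mutation, crossover

# Toy usage with a near-converged population of P/T sequences.
pop = [["P", "T", "P", "T"]] * 9 + [["T", "T", "P", "P"]]
print(adaptive_rates(pop))  # high mutation rate, since diversity is low
```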
Sixuan Wang
Dept. of CSIT, La Trobe University, Melbourne, VIC, Australia
Jiao Yin
Dept. of CSIT, La Trobe University, Melbourne, VIC, Australia
Jinli Cao
La Trobe University
Internet computing, XML data, Web Service, Database systems
Mingjian Tang
Dept. of CSIT, La Trobe University, Melbourne, VIC, Australia
Hua Wang
ISILC, Victoria University, Melbourne, VIC, Australia
Yanchun Zhang
School of CST, Zhejiang Normal University, Jinhua, Zhejiang, China; Peng Cheng Laboratory, Shenzhen, China