On Effectiveness of Graph Neural Network Architectures for Network Digital Twins (NDTs)

📅 2025-08-04
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
Future networks such as 6G must manage massive numbers of heterogeneous devices and diverse service requirements, yet conventional network management lacks the automation to cope, and existing AI-driven approaches that train on live networks face safety and cost bottlenecks. This paper proposes a Network Digital Twin (NDT) framework that leverages multi-layer knowledge graphs and Graph Neural Networks (GNNs) to enable risk-free, high-fidelity simulation and accurate prediction of user-perceived key performance indicators (KPIs). The authors conduct the first systematic empirical evaluation of four representative GNN architectures—GraphTransformer, GCN, GAT, and GIN—on network KPI forecasting. Results show that GraphTransformer achieves the best prediction accuracy, while the other architectures exhibit notable trade-offs between training efficiency and performance. Validated on real-world RIPE Atlas measurement data, the NDT framework demonstrates high accuracy, scalability, and practical deployability.

📝 Abstract
Future networks, such as 6G, will need to support a vast and diverse range of interconnected devices and applications, each with its own set of requirements. Traditional network management approaches will not suffice, so automated solutions are becoming a must. However, network automation frameworks are prone to errors, and they often employ ML-based techniques that require training to learn how the network can be optimized. In this sense, network digital twins are a useful tool that allows for the simulation, testing, and training of AI models without affecting real-world networks and users. This paper presents an AI-based Network Digital Twin (AI-NDT) that leverages a multi-layered knowledge graph architecture and graph neural networks to predict network metrics that directly affect the quality of experience of users. An evaluation of four of the most prominent Graph Neural Network (GNN) architectures was conducted to assess their effectiveness in developing network digital twins. We trained the digital twin on publicly available measurement data from RIPE Atlas, thereby obtaining results close to what is expected in real-world applications. The results show that, among the four architectures evaluated, GraphTransformer delivers the best performance, although other architectures may fit better in scenarios where shorter training time is important while still delivering acceptable results. The results of this work are indicative of what might become common practice for proactive network management, offering a scalable and accurate solution aligned with the requirements of next-generation networks.
Problem

Research questions and friction points this paper is trying to address.

Evaluating GNN architectures for Network Digital Twins
Predicting network metrics affecting user experience
Comparing performance of four GNN architectures
Innovation

Methods, ideas, or system contributions that make the work stand out.

Leverages multi-layered knowledge graph architecture
Uses Graph Neural Networks for metric prediction
Evaluates four GNN architectures for best performance
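To illustrate the kind of graph convolution the compared architectures build on, here is a minimal NumPy sketch of a single GCN propagation step (one of the four architectures evaluated). The adjacency matrix, node features, and identity weight matrix are hypothetical toy values, not data from the paper; the paper's actual models are trained on RIPE Atlas measurements.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One GCN propagation step: H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W)."""
    A_hat = A + np.eye(A.shape[0])          # add self-loops
    d = A_hat.sum(axis=1)                   # node degrees of A_hat
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))  # symmetric normalization
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ H @ W, 0.0)

# Toy 3-node path graph with 2 features per node (hypothetical values)
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
H = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
W = np.eye(2)  # identity weights keep the example deterministic
H_next = gcn_layer(A, H, W)
```

GAT replaces the fixed degree normalization with learned attention coefficients, GIN uses a sum aggregator followed by an MLP, and GraphTransformer applies full multi-head attention over neighbors; all four share this message-passing skeleton.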
Iulisloi Zacarias
Technische Universität Braunschweig
Oussama Ben Taarit
Technische Universität Braunschweig
Admela Jukan
Professor, TU Braunschweig, Germany
Network Architecture · Optical Networking · Cloud Computing · Animal-Computer Interaction