Transformer-Empowered Actor-Critic Reinforcement Learning for Sequence-Aware Service Function Chain Partitioning

📅 2025-04-26
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the challenges of low-latency constraints, resource scarcity, and difficulty in modeling sequential dependencies in cross-domain service function chain (SFC) embedding within 6G multi-domain networks, this paper proposes a sequence-aware deep reinforcement learning framework. The method integrates the Transformer’s self-attention mechanism into an Actor-Critic architecture to explicitly capture long-range temporal dependencies among virtualized network functions (VNFs). It further introduces an ε-LoPe exploration strategy and an asymptotic return normalization mechanism to enhance training stability and convergence speed. Experimental results demonstrate that the proposed approach consistently outperforms state-of-the-art methods across key metrics, including long-term acceptance rate, resource utilization efficiency, and scalability, while achieving low inference latency, making it well-suited for large-scale, dynamic 6G network environments.
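The self-attention mechanism the summary refers to can be illustrated with a minimal single-head sketch. This is not the paper's model: the dimensions, random weight matrices, and function name below are hypothetical, and real Transformers add multiple heads, positional encodings, and learned parameters.

```python
import numpy as np

def scaled_dot_product_attention(x, w_q, w_k, w_v):
    """Single-head self-attention over a sequence of VNF feature vectors.

    x: (n_vnfs, d_model), one row per VNF in the chain.
    Returns (n_vnfs, d_v): each VNF's output is a weighted mix of every
    VNF in the chain, so long-range dependencies between distant VNFs
    are visible in a single step rather than through a recurrent state.
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / np.sqrt(k.shape[-1])          # pairwise VNF affinities
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the chain
    return weights @ v

rng = np.random.default_rng(0)
n_vnfs, d_model, d_k = 5, 8, 4                       # illustrative: a 5-VNF chain
x = rng.standard_normal((n_vnfs, d_model))
w_q, w_k, w_v = (rng.standard_normal((d_model, d_k)) for _ in range(3))
out = scaled_dot_product_attention(x, w_q, w_k, w_v)
print(out.shape)  # (5, 4)
```

Because every VNF attends to every other VNF in one matrix product, the per-VNF representations can be computed in parallel, which is the property the summary invokes for coordinated, parallelized partitioning decisions.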

📝 Abstract
In the forthcoming era of 6G networks, characterized by unprecedented data rates, ultra-low latency, and extensive connectivity, effective management of Virtualized Network Functions (VNFs) is essential. VNFs are software-based counterparts of traditional hardware devices that facilitate flexible and scalable service provisioning. Service Function Chains (SFCs), structured as ordered sequences of VNFs, are pivotal in orchestrating complex network services. Nevertheless, partitioning SFCs across multi-domain network infrastructures presents substantial challenges due to stringent latency constraints and limited resource availability. Conventional optimization-based methods typically exhibit low scalability, whereas existing data-driven approaches often fail to adequately balance computational efficiency with the capability to effectively account for dependencies inherent in SFCs. To overcome these limitations, we introduce a Transformer-empowered actor-critic framework specifically designed for sequence-aware SFC partitioning. By utilizing the self-attention mechanism, our approach effectively models complex inter-dependencies among VNFs, facilitating coordinated and parallelized decision-making processes. Additionally, we enhance training stability and convergence using an ε-LoPe exploration strategy together with Asymptotic Return Normalization. Comprehensive simulation results demonstrate that the proposed methodology outperforms existing state-of-the-art solutions in terms of long-term acceptance rates, resource utilization efficiency, and scalability, while achieving rapid inference. This study not only advances intelligent network orchestration by delivering a scalable and robust solution for SFC partitioning within emerging 6G environments, but also bridges recent advancements in Large Language Models (LLMs) with the optimization of next-generation networks.
Problem

Research questions and friction points this paper is trying to address.

Partitioning Service Function Chains (SFCs) across multi-domain networks with latency constraints.
Balancing computational efficiency and dependency modeling in SFC optimization.
Enhancing scalability and resource utilization in 6G network orchestration.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Transformer-empowered actor-critic framework for SFC partitioning
Self-attention mechanism models VNF inter-dependencies effectively
Enhanced training with ε-LoPe and Asymptotic Return Normalization
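ε-LoPe and Asymptotic Return Normalization are contributions specific to this paper and are not fully specified on this page. As rough orientation only, the generic mechanisms they refine, ε-style exploration and running return normalization, can be sketched as follows; every name, constant, and value here is illustrative, not the paper's scheme.

```python
import numpy as np

class RunningReturnNormalizer:
    """Running mean/std normalization of returns, a common RL stabilizer.

    Keeps critic targets on a comparable scale throughout training.
    Uses Welford's online algorithm for the running mean and variance.
    """
    def __init__(self, eps=1e-8):
        self.count, self.mean, self.m2, self.eps = 0, 0.0, 0.0, eps

    def update(self, ret):
        self.count += 1
        delta = ret - self.mean
        self.mean += delta / self.count
        self.m2 += delta * (ret - self.mean)

    def normalize(self, ret):
        var = self.m2 / max(self.count - 1, 1)       # sample variance
        return (ret - self.mean) / np.sqrt(var + self.eps)

def epsilon_exploration(policy_probs, epsilon, rng):
    """Generic ε-style exploration: with probability epsilon pick a
    uniformly random domain, otherwise sample from the actor's policy."""
    n = len(policy_probs)
    if rng.random() < epsilon:
        return int(rng.integers(n))
    return int(rng.choice(n, p=policy_probs))

rng = np.random.default_rng(1)
norm = RunningReturnNormalizer()
for g in [10.0, 12.0, 9.0, 11.0]:                    # toy episode returns
    norm.update(g)
print(round(norm.normalize(11.0), 3))                # → 0.387
```

The paper's variants reportedly improve on these baselines for stability and convergence speed; the sketch above shows only the underlying ideas.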