QECO: A QoE-Oriented Computation Offloading Algorithm Based on Deep Reinforcement Learning for Mobile Edge Computing

📅 2023-11-04
🏛️ IEEE Transactions on Network Science and Engineering
📈 Citations: 1
Influential: 0
🤖 AI Summary
To address the challenge of Quality-of-Experience (QoE) assurance in dynamic and uncertain mobile edge computing (MEC) environments, this paper formulates computation task offloading as a per-user Markov Decision Process and proposes QECO, a distributed, QoE-oriented offloading algorithm that lets each device decide independently, without inter-device coordination or global state knowledge. The deep reinforcement learning (DRL) reward jointly accounts for user dynamics and real-time edge server workloads. Built on a Deep Q-Network (DQN) architecture and a multi-objective QoE quantification model, the approach enables low-overhead, online decision-making. Simulation results show gains over state-of-the-art baselines: up to 14.4% more completed tasks, 9.2% lower task delay, 6.3% lower device energy consumption, and a 37.1% average QoE improvement.
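The summary does not spell out the multi-objective QoE quantification model. A minimal sketch of one plausible form, assuming QoE rewards task completion and penalizes delay and energy with illustrative weights (the function name and weights are hypothetical, not from the paper):

```python
# Hypothetical multi-objective QoE reward: completion bonus minus
# weighted delay and energy penalties. Weights are illustrative only.
def qoe_reward(completed: bool, delay: float, energy: float,
               w_delay: float = 0.5, w_energy: float = 0.5) -> float:
    base = 1.0 if completed else -1.0  # deadline met vs. missed
    return base - w_delay * delay - w_energy * energy
```

A reward of this shape lets a DRL agent trade off completion rate, latency, and energy in a single scalar objective, matching the three metrics the paper reports.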
📝 Abstract
In the realm of mobile edge computing (MEC), efficient computation task offloading plays a pivotal role in ensuring a seamless quality of experience (QoE) for users. Maintaining a high QoE is paramount in today's interconnected world, where users demand reliable services. This challenge is a primary factor in handling dynamic and uncertain mobile environments. In this study, we delve into computation offloading in MEC systems, where strict task processing deadlines and energy constraints can adversely affect system performance. We formulate the computation task offloading problem as a Markov decision process (MDP) to maximize the long-term QoE of each user individually. We propose a distributed QoE-oriented computation offloading (QECO) algorithm based on deep reinforcement learning (DRL) that empowers mobile devices to make their offloading decisions without requiring knowledge of decisions made by other devices. Through numerical studies, we evaluate the performance of QECO. Simulation results reveal that compared to existing state-of-the-art methods, QECO increases the number of completed tasks by up to 14.4%, while simultaneously reducing task delay and energy consumption by 9.2% and 6.3%, respectively. Together, these improvements yield a significant average QoE enhancement of 37.1%. This substantial improvement is achieved by accurately accounting for user dynamics and edge server workloads when making intelligent offloading decisions, highlighting QECO's effectiveness in enhancing users' experience in MEC systems.
Problem

Research questions and friction points this paper is trying to address.

Optimizing computation offloading in mobile edge computing
Maximizing user QoE under energy and deadline constraints
Addressing dynamic mobile environments with a distributed DRL algorithm
Innovation

Methods, ideas, or system contributions that make the work stand out.

Deep reinforcement learning optimizes mobile edge computation offloading
Markov decision process maximizes long-term user QoE
Distributed algorithm enables independent device offloading decisions