Deep Reinforcement Learning for Online Latency Aware Workload Offloading in Mobile Edge Computing
Owing to the resource-constrained nature of Internet of Things (IoT) devices, offloading tasks from IoT devices to nearby mobile edge computing (MEC) servers can not only save the energy of IoT devices but also reduce the response time of executing the tasks. However, offloading a task to the nearest MEC server may not be the optimal solution because of the limited computing resources of that server. Thus, jointly optimizing the offloading decision and resource management is critical, but has yet to be fully explored. Here, the offloading decision refers to where to offload a task, and resource management refers to how much computing resource of an MEC server is allocated to a task. By considering the waiting time of a task in the communication and computing queues (which is ignored by most existing works) as well as task priorities, we propose the Deep reinforcement lEarning based offloading deCision and rEsource managemeNT (DECENT) algorithm, which leverages the advantage actor-critic method to optimize the offloading decision and computing resource allocation for each arriving task in real time such that the cumulative weighted response time is minimized. The performance of DECENT is demonstrated via different experiments.
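The abstract describes an advantage actor-critic (A2C) agent that, for each arriving task, jointly selects an offloading target and a compute allocation so as to minimize the cumulative weighted response time. The following is a minimal sketch of that general idea, not the authors' implementation: the state features, the discretized joint action space (server choice x allocation level), the network sizes, and the reward shape are all illustrative assumptions.

```python
# Minimal A2C sketch (assumptions, not the DECENT implementation): per-task
# offloading and compute-allocation decisions with a one-step advantage update.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_SERVERS = 3          # assumed number of candidate MEC servers
NUM_ALLOC_LEVELS = 4     # assumed discrete compute-allocation levels
STATE_DIM = 8            # e.g. task size, priority, per-server queue delays (assumed)
GAMMA = 0.99

class ActorCritic(nn.Module):
    def __init__(self):
        super().__init__()
        self.shared = nn.Sequential(nn.Linear(STATE_DIM, 64), nn.ReLU())
        # Joint discrete action: which server and how much compute to allocate.
        self.policy_head = nn.Linear(64, NUM_SERVERS * NUM_ALLOC_LEVELS)
        self.value_head = nn.Linear(64, 1)

    def forward(self, state):
        h = self.shared(state)
        return self.policy_head(h), self.value_head(h)

def a2c_step(model, optimizer, state, next_state, reward, done):
    """One-step A2C update for a single arriving task (online setting)."""
    logits, value = model(state)
    dist = torch.distributions.Categorical(logits=logits)
    action = dist.sample()

    with torch.no_grad():
        _, next_value = model(next_state)
        target = reward + GAMMA * next_value * (1.0 - done)

    advantage = target - value                        # TD error as advantage estimate
    actor_loss = -dist.log_prob(action) * advantage.detach()
    critic_loss = F.mse_loss(value, target)
    loss = actor_loss + 0.5 * critic_loss - 0.01 * dist.entropy()

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return action.item()

if __name__ == "__main__":
    model = ActorCritic()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    # Dummy transition: in practice the reward would be the negative weighted
    # response time (queueing + transmission + execution delay) of the task.
    s, s_next = torch.randn(STATE_DIM), torch.randn(STATE_DIM)
    a = a2c_step(model, optimizer, s, s_next,
                 reward=torch.tensor(-1.5), done=torch.tensor(0.0))
    print("chosen (server, allocation) action index:", a)
```

In this sketch the advantage is estimated with a one-step temporal-difference error, which supports the online, per-task decision making the abstract emphasizes; task priorities would enter through the weighting of each task's response time in the reward.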