Stochastic Control of Computation Offloading to a Helper with a Dynamically Loaded CPU

02/27/2018
by Yunzheng Tao, et al.

Due to the densification of wireless networks, there exists an abundance of idle computation resources at edge devices. These resources can be scavenged by offloading heavy computation tasks from small IoT devices in proximity, thereby overcoming their limitations and lengthening their battery lives. However, unlike dedicated servers, the spare resources offered by edge helpers are random and intermittent. Thus, it is essential for a user to intelligently control the amounts of data for offloading and local computing so as to ensure that a computation task can be finished on time while consuming minimum energy. In this paper, we design energy-efficient control policies for a computation offloading system with a random channel and a helper with a dynamically loaded CPU. Specifically, the policy adopted by the user aims at determining the sizes of offloaded and locally computed data for a given task in different slots such that the total energy consumption for transmission and local computing is minimized under a task-deadline constraint. As a result, the policies endow an offloading user with robustness against channel-and-helper randomness, besides balancing offloading and local computing. By modeling the channel and the helper's CPU as Markov chains, the offloading-control problem is converted into a Markov decision process. Although dynamic programming (DP) for numerically solving the problem does not yield the optimal policies in closed form, we leverage the procedure to quantify the optimal policy structure and apply the result to design optimal or sub-optimal policies. For different cases ranging from zero to large buffers, the low complexity of the policies overcomes the "curse of dimensionality" in DP arising from the joint consideration of channel, helper-CPU, and buffer states.
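To make the Markov-decision-process formulation concrete, the sketch below shows a toy finite-horizon dynamic program for deadline-constrained offloading. All model parameters (slot count, data size, transition probabilities, energy costs, and the convex local-CPU energy) are illustrative assumptions, not the paper's model; it only illustrates the general structure of minimizing expected energy over channel and helper-CPU Markov states.

```python
from functools import lru_cache

# Toy finite-horizon DP for deadline-constrained offloading.
# All numbers below are assumed for illustration only.

T = 4                 # slots before the deadline
D = 6                 # data units that must be processed by the deadline
CH = (0, 1)           # channel state: 0 = bad, 1 = good
HP = (0, 1)           # helper CPU:   0 = busy, 1 = idle

# Assumed Markov transition matrices (each row sums to 1)
P_CH = [[0.6, 0.4], [0.3, 0.7]]   # channel state transitions
P_HP = [[0.5, 0.5], [0.4, 0.6]]   # helper-CPU state transitions

TX_E = {0: 3.0, 1: 1.0}  # energy per unit offloaded, by channel state
LOC_E = 2.0              # local-CPU energy coefficient (quadratic in load)
INF = float('inf')

@lru_cache(maxsize=None)
def V(t, d, ch, hp):
    """Minimum expected energy to finish d units with t slots left,
    given current channel state ch and helper-CPU state hp."""
    if d == 0:
        return 0.0
    if t == 0:
        return INF                        # deadline violated
    best = INF
    for off in range(d + 1):              # units offloaded this slot
        if off > 0 and hp == 0:
            continue                      # helper busy: cannot offload
        for loc in range(d - off + 1):    # units computed locally
            cost = TX_E[ch] * off + LOC_E * loc * loc
            exp_future = sum(P_CH[ch][c2] * P_HP[hp][h2]
                             * V(t - 1, d - off - loc, c2, h2)
                             for c2 in CH for h2 in HP)
            best = min(best, cost + exp_future)
    return best

if __name__ == "__main__":
    print(V(T, D, 1, 1))  # expected energy from a good channel, idle helper
```

This is precisely where the "curse of dimensionality" mentioned in the abstract appears: the table `V` grows with the product of the horizon, buffer, channel, and helper-CPU state spaces, which motivates the paper's structural results and low-complexity policies in place of brute-force DP.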


