Dynamic Compression Ratio Selection for Edge Inference Systems with Hard Deadlines

05/25/2020
by Xiufeng Huang et al.

Implementing machine learning algorithms on Internet of Things (IoT) devices has become essential for emerging applications such as autonomous driving and environment monitoring. However, limited computation capability and energy budgets make it difficult to run complex machine learning algorithms on IoT devices, especially when a latency deadline exists. One solution is to offload the computation-intensive tasks to an edge server. However, wirelessly uploading the raw data is time-consuming and may lead to deadline violations. To reduce the communication cost, lossy data compression can be exploited for inference tasks, but it may produce more erroneous inference results. In this paper, we propose a dynamic compression ratio selection scheme for an edge inference system with hard deadlines. The key idea is to balance the tradeoff between communication cost and inference accuracy. By dynamically selecting the optimal compression ratio according to the remaining deadline budgets of queued tasks, more tasks can be completed in time with correct inference under limited communication resources. Furthermore, information augmentation, which retransmits less-compressed data for tasks with erroneous inference, is proposed to enhance accuracy. Since the correctness of an inference result is often hard to know, we use uncertainty to estimate the confidence of the inference and, based on that, jointly optimize information augmentation and compression ratio selection. Lastly, considering wireless transmission errors, we further design a retransmission scheme to reduce the performance degradation caused by packet losses. Simulation results show the performance of the proposed schemes under different deadlines and task arrival rates.
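To make the deadline-versus-accuracy tradeoff concrete, the sketch below illustrates the two ideas described in the abstract: picking a compression ratio that fits the remaining deadline budget, and using an uncertainty proxy (softmax entropy) to decide whether to request a less-compressed retransmission. The candidate ratios, upload times, accuracies, threshold, and function names are all illustrative assumptions, not the paper's actual algorithm or parameters.

```python
import math

# Hypothetical candidate compression ratios and their assumed profiles:
# higher compression -> shorter upload time but lower expected accuracy.
# All numbers are illustrative placeholders, not values from the paper.
CANDIDATES = [
    {"ratio": 1.0,  "upload_time": 8.0, "accuracy": 0.95},
    {"ratio": 4.0,  "upload_time": 2.0, "accuracy": 0.88},
    {"ratio": 16.0, "upload_time": 0.5, "accuracy": 0.75},
]


def select_compression_ratio(remaining_budget, candidates=CANDIDATES):
    """Among options whose upload time fits the remaining deadline budget,
    pick the one with the best expected accuracy; if none fit, fall back
    to the fastest (most compressed) option."""
    feasible = [c for c in candidates if c["upload_time"] <= remaining_budget]
    if not feasible:
        return min(candidates, key=lambda c: c["upload_time"])
    return max(feasible, key=lambda c: c["accuracy"])


def softmax_entropy(logits):
    """Entropy of the softmax output, used here as a simple uncertainty proxy."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    z = sum(exps)
    probs = [e / z for e in exps]
    return -sum(p * math.log(p) for p in probs if p > 0)


def maybe_augment(logits, remaining_budget, threshold=0.5):
    """Information augmentation (sketch): if the inference is uncertain and
    deadline budget remains, request a less-compressed retransmission."""
    if softmax_entropy(logits) > threshold and remaining_budget > 0:
        return select_compression_ratio(remaining_budget)
    return None
```

In this toy version, a task arriving with a tight remaining budget is sent at a high compression ratio, while an uncertain result with slack in its deadline triggers a second, less-compressed upload; the paper's scheme optimizes these decisions jointly across the queue rather than greedily per task.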
