Progressive Feature Transmission for Split Inference at the Wireless Edge

12/14/2021
by Qiao Lan, et al.

In edge inference, an edge server provides remote-inference services to edge devices. This requires the edge devices to upload high-dimensional features of data samples over resource-constrained wireless channels, which creates a communication bottleneck. The conventional solution of feature pruning requires that the device have access to the inference model, which is unavailable in the split-inference scenario considered here. To address this issue, we propose the progressive feature transmission (ProgressFTX) protocol, which minimizes communication overhead by progressively transmitting features until a target confidence level is reached. The optimal control policy of the protocol, which accelerates inference, is derived and comprises two key operations. The first is importance-aware feature selection at the server, for which it is shown to be optimal to select the most important features, characterized by the largest discriminant gains of the corresponding feature dimensions. The second is transmission-termination control by the server, for which the optimal policy is shown to exhibit a threshold structure. Specifically, transmission is stopped when the incremental uncertainty reduction from further feature transmission is outweighed by its communication cost. The indices of the selected features and the transmission decision are fed back to the device in each slot. The optimal policy is first derived for the tractable case of linear classification and then extended to the more complex case of classification using a convolutional neural network. Both Gaussian and fading channels are considered. Experimental results are obtained for both a statistical data model and a real dataset. They show that ProgressFTX can substantially reduce communication latency compared to conventional feature pruning and random feature transmission.
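The two control operations described above can be illustrated with a toy sketch. The confidence model, cost parameter, and function names below are hypothetical illustrations, not the paper's actual policy: features are sent in descending order of discriminant gain (importance-aware selection), and transmission halts once the next feature's incremental uncertainty reduction falls below its communication cost or the target confidence is met (threshold-structured stopping).

```python
import numpy as np

def progressive_feature_transmission(discriminant_gains,
                                     comm_cost_per_feature=0.05,
                                     confidence_target=0.9):
    """Toy ProgressFTX-style control loop (illustrative assumptions only).

    Returns the indices of transmitted feature dimensions and the final
    confidence reached.
    """
    # Importance-aware selection: order dimensions by discriminant gain.
    order = np.argsort(discriminant_gains)[::-1]
    transmitted = []
    confidence = 0.5  # uninformative prior for a binary classifier

    for idx in order:
        # Hypothetical confidence model: each feature shrinks the remaining
        # uncertainty by a factor depending on its discriminant gain.
        gain = discriminant_gains[idx]
        delta = (1.0 - confidence) * gain / (1.0 + gain)
        # Threshold-structured stopping rule: halt when the incremental
        # uncertainty reduction is outweighed by the communication cost.
        if delta < comm_cost_per_feature:
            break
        transmitted.append(int(idx))
        confidence += delta
        if confidence >= confidence_target:
            break
    return transmitted, confidence

# Example: four feature dimensions with unequal discriminant gains.
gains = np.array([2.0, 0.1, 1.0, 0.01])
sent, conf = progressive_feature_transmission(gains)
```

With these gains, only the two most discriminative dimensions (indices 0 and 2) are transmitted before the target confidence is reached, mirroring the protocol's early-stopping behavior.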


