Resource-Constrained Edge AI with Early Exit Prediction

06/15/2022
by Rongkang Dong, et al.

By leveraging data sample diversity, the early-exit network has recently emerged as a prominent neural network architecture for accelerating deep learning inference. However, the intermediate classifiers of the early exits introduce additional computation overhead, which is unfavorable for resource-constrained edge artificial intelligence (AI). In this paper, we propose an early exit prediction mechanism to reduce the on-device computation overhead in a device-edge co-inference system supported by early-exit networks. Specifically, we design a low-complexity module, namely the Exit Predictor, to guide distinctly "hard" samples to bypass the computation of the early exits. Moreover, to account for varying communication bandwidth, we extend the early exit prediction mechanism to latency-aware edge inference, which adapts the prediction thresholds of the Exit Predictor and the confidence thresholds of the early-exit network via a few simple regression models. Extensive experimental results demonstrate the effectiveness of the Exit Predictor in achieving a better tradeoff between accuracy and on-device computation overhead for early-exit networks. In addition, compared with the baseline methods, the proposed method for latency-aware edge inference attains higher inference accuracy under different bandwidth conditions.
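The mechanism described above can be illustrated with a minimal sketch. The following toy Python/NumPy code is an assumption-laden illustration, not the paper's implementation: the backbone "blocks", exit "heads", and the `exit_predictor` callable are all hypothetical stand-ins. It shows the two thresholds at play: a confidence threshold that decides whether a sample exits at an intermediate classifier, and a prediction threshold below which the Exit Predictor lets a "hard" sample skip that classifier entirely, saving its computation.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_layer(d_in, d_out):
    """Hypothetical backbone block: a random linear map with ReLU."""
    w = rng.normal(size=(d_in, d_out)) / np.sqrt(d_in)
    return lambda x: np.maximum(x @ w, 0.0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

class EarlyExitNet:
    def __init__(self, dims, n_classes, conf_threshold=0.9):
        self.blocks = [make_layer(a, b) for a, b in zip(dims, dims[1:])]
        # One classifier head (weight matrix) attached after each block.
        self.heads = [rng.normal(size=(d, n_classes)) for d in dims[1:]]
        self.conf_threshold = conf_threshold

    def infer(self, x, exit_predictor=None, pred_threshold=0.5):
        """Returns (predicted_label, exit_index, multiply-adds saved)."""
        saved = 0
        for i, (block, head) in enumerate(zip(self.blocks, self.heads)):
            x = block(x)
            last = i == len(self.blocks) - 1
            # Exit Predictor: if this sample looks too "hard" to exit
            # here, skip the early-exit classifier and save its cost.
            if not last and exit_predictor is not None:
                if exit_predictor(x, i) < pred_threshold:
                    saved += head.size
                    continue
            probs = softmax(x @ head)
            if last or probs.max() >= self.conf_threshold:
                return int(probs.argmax()), i, saved
```

For instance, with an (artificial) Exit Predictor that scores every sample as hard, all intermediate classifiers are bypassed and the sample is classified only at the final exit:

```python
net = EarlyExitNet([16, 12, 10, 8], n_classes=5, conf_threshold=0.95)
x = rng.normal(size=16)
label, exit_idx, saved = net.infer(x, exit_predictor=lambda h, i: 0.0)
# exit_idx == 2 (final exit); saved == (12 + 10) * 5 head multiply-adds
```

Adapting `pred_threshold` and `conf_threshold` (as the latency-aware extension does via regression models) then trades accuracy against on-device computation under the current bandwidth.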

