Optimizing Deep Learning Inference on Embedded Systems Through Adaptive Model Selection

11/09/2019
by   Vicent Sanz Marco, et al.

Deep neural networks (DNNs) are becoming a key enabling technology for many application domains. However, on-device inference on battery-powered, resource-constrained embedded systems is often infeasible due to the prohibitively long inference times and resource requirements of many DNNs. Offloading computation to the cloud is often unacceptable due to privacy concerns, high latency, or lack of connectivity. While compression algorithms often succeed in reducing inference times, they come at the cost of reduced accuracy. This paper presents a new, alternative approach to enable efficient execution of DNNs on embedded devices. Our approach dynamically determines which DNN to use for a given input by considering the desired accuracy and inference time. It employs machine learning to develop a low-cost predictive model that quickly selects a pre-trained DNN for a given input and optimization constraint. We achieve this by first training a predictive model offline, and then using the learned model to select a DNN for new, unseen inputs. We apply our approach to two representative DNN domains: image classification and machine translation. We evaluate our approach on a Jetson TX2 embedded deep learning platform, considering a range of influential DNN models including convolutional and recurrent neural networks. For image classification, we achieve a 1.8x reduction in inference time with a 7.52% improvement in accuracy over the most-capable single model. For machine translation, we achieve a 1.34x reduction in inference time over the most-capable single model, with little impact on the quality of translation.
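The selection step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the model names, timings, and the per-model success predictor are all hypothetical stand-ins for the offline-trained predictive model.

```python
# Sketch of adaptive model selection: a low-cost predictor, trained
# offline, decides which pre-trained DNN to run for each input under
# an inference-time budget. All names and numbers are illustrative.

from dataclasses import dataclass
from typing import Callable

@dataclass
class CandidateDNN:
    name: str
    inference_ms: float                    # measured average inference time
    predictor: Callable[[dict], bool]      # predicts whether this DNN will
                                           # handle the input correctly

def select_dnn(features, candidates, time_budget_ms):
    """Pick the cheapest DNN predicted to succeed within the time budget."""
    # Try candidates from fastest to slowest: prefer the cheapest model
    # that the predictor expects to get this input right.
    for model in sorted(candidates, key=lambda m: m.inference_ms):
        if model.inference_ms <= time_budget_ms and model.predictor(features):
            return model
    # Otherwise fall back to the most capable model that fits the budget.
    feasible = [m for m in candidates if m.inference_ms <= time_budget_ms]
    return max(feasible, key=lambda m: m.inference_ms) if feasible else None

# Example: an "easy" input is routed to the fast model, a hard one to the
# larger model, and a tight budget forces the fast model as a fallback.
candidates = [
    CandidateDNN("fast_net", 10.0, lambda f: f["easy"]),
    CandidateDNN("big_net", 40.0, lambda f: True),
]
print(select_dnn({"easy": True}, candidates, 50.0).name)   # fast_net
print(select_dnn({"easy": False}, candidates, 50.0).name)  # big_net
print(select_dnn({"easy": False}, candidates, 20.0).name)  # fast_net
```

The design choice mirrored here is the paper's key trade-off: paying a small, fixed cost for the premodel in exchange for skipping expensive DNNs on inputs where a cheaper one is predicted to suffice.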

Related research

05/11/2018 - Adaptive Selection of Deep Learning Models on Embedded Systems
07/20/2020 - SeqPoint: Identifying Representative Iterations of Sequence-based Neural Networks
07/21/2021 - Performance landscape of resource-constrained platforms targeting DNNs
03/16/2018 - TBD: Benchmarking and Analyzing Deep Neural Network Training
10/21/2018 - To Compress, or Not to Compress: Characterizing Deep Learning Model Compression for Embedded Inference
08/21/2020 - A Survey on Assessing the Generalization Envelope of Deep Neural Networks at Inference Time for Image Classification
08/20/2021 - Early-exit deep neural networks for distorted images: providing an efficient edge offloading
