Characterizing the Deep Neural Networks Inference Performance of Mobile Applications

09/10/2019
by   Samuel S. Ogden, et al.

Today's mobile applications increasingly leverage deep neural networks to provide novel features, such as image and speech recognition. To use a pre-trained deep neural network, mobile developers can either host it on a cloud server, referred to as cloud-based inference, or ship it with the mobile application, referred to as on-device inference. In this work, we investigate the inference performance of these two common approaches on both mobile devices and public clouds, using popular convolutional neural networks (CNNs). Our measurement study suggests the need for both on-device and cloud-based inference to support mobile applications. In particular, newer mobile devices are able to run mobile-optimized CNN models in reasonable time. However, for older mobile devices, or to use more complex CNN models, mobile applications should opt for cloud-based inference. We further demonstrate that variable network conditions can lead to poor end-to-end time for cloud-based inference. To support efficient cloud-based inference, we propose a CNN model selection algorithm, CNNSelect, that dynamically selects the most appropriate CNN model for each inference request and adapts its selection to the different SLAs and execution time budgets caused by variable mobile environments. The key idea of CNNSelect is to trade off inference speed and accuracy at runtime using a set of CNN models. We demonstrate that CNNSelect smoothly improves inference accuracy while maintaining SLA attainment in 88.5% more cases than a greedy baseline.
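The core idea of the selection step described above can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: it assumes each candidate CNN has been profiled offline for accuracy and latency, and it picks the most accurate model whose estimated latency fits the remaining per-request time budget (the SLA minus time already spent, e.g. on network transfer). All model names and numbers below are hypothetical.

```python
# Illustrative sketch of budget-aware CNN model selection.
# Model profiles (accuracy, latency) are assumed to come from
# offline benchmarking; the figures here are made up.
from dataclasses import dataclass

@dataclass
class ModelProfile:
    name: str
    accuracy: float    # e.g. top-1 accuracy from offline profiling
    latency_ms: float  # estimated inference latency on the target server

def select_model(models, budget_ms):
    """Return the most accurate model expected to finish within budget_ms.

    If no model fits the budget, fall back to the fastest one so the
    request still gets an answer rather than missing its SLA outright.
    """
    feasible = [m for m in models if m.latency_ms <= budget_ms]
    if feasible:
        return max(feasible, key=lambda m: m.accuracy)
    return min(models, key=lambda m: m.latency_ms)

models = [
    ModelProfile("mobilenet_v1", accuracy=0.70, latency_ms=30),
    ModelProfile("inception_v3", accuracy=0.78, latency_ms=120),
    ModelProfile("resnet_152",   accuracy=0.79, latency_ms=300),
]

print(select_model(models, budget_ms=150).name)  # inception_v3
print(select_model(models, budget_ms=10).name)   # mobilenet_v1
```

A purely greedy baseline would always pick the single fastest (or single most accurate) model; selecting per request from a set of models is what lets accuracy rise when the budget allows while still meeting tight budgets with a lighter model.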

