AdaDeep: A Usage-Driven, Automated Deep Model Compression Framework for Enabling Ubiquitous Intelligent Mobiles

06/08/2020
by   Sicong Liu, et al.
Recent breakthroughs in Deep Neural Networks (DNNs) have fueled a rapidly growing demand for bringing DNN-powered intelligence to mobile platforms. While DNN compression techniques have demonstrated the potential of deploying DNNs on resource-constrained platforms, current practice suffers from two limitations: 1) only stand-alone compression schemes have been investigated, even though each compression technique is suited only to certain types of DNN layers; and 2) most compression techniques are optimized for DNN inference accuracy, without explicitly considering other application-driven system metrics (e.g., latency and energy cost) or the varying resource availability across platforms (e.g., storage and processing capability). To this end, we propose AdaDeep, a usage-driven, automated DNN compression framework for systematically exploring the desired trade-off between performance and resource constraints at a holistic system level. Specifically, in a layer-wise manner, AdaDeep automatically selects the most suitable combination of compression techniques and the corresponding compression hyperparameters for a given DNN. Thorough evaluations on six datasets and across twelve devices demonstrate that AdaDeep can achieve up to 18.6× latency reduction, 9.8× energy-efficiency improvement, and 37.3× storage reduction in DNNs while incurring negligible accuracy loss. Furthermore, AdaDeep also uncovers multiple novel combinations of compression techniques.
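To make the idea of layer-wise, budget-aware selection concrete, the sketch below shows a toy greedy search: for each layer, it picks the compression technique with the smallest estimated accuracy drop until an overall storage budget is met. This is only an illustrative sketch, not AdaDeep's actual search algorithm; all layer names, techniques, storage figures, and accuracy-drop estimates are hypothetical placeholders.

```python
# Hypothetical per-layer candidates: (technique, storage ratio kept, est. accuracy drop %).
# "none" means the layer is left uncompressed.
CANDIDATES = {
    "conv1": [("none", 1.0, 0.0), ("pruning", 0.4, 0.5), ("quantization", 0.25, 0.8)],
    "conv2": [("none", 1.0, 0.0), ("pruning", 0.4, 0.7), ("svd", 0.3, 0.6)],
    "fc1":   [("none", 1.0, 0.0), ("hashing", 0.1, 1.2), ("svd", 0.2, 0.9)],
}

# Hypothetical uncompressed storage per layer, in MB.
LAYER_STORAGE_MB = {"conv1": 2.0, "conv2": 8.0, "fc1": 40.0}


def select_compression(budget_mb):
    """Greedy layer-wise selection: visit the largest layers first
    (they offer the biggest savings) and, while over budget, apply
    the candidate technique with the smallest accuracy drop."""
    plan = {layer: ("none", 1.0, 0.0) for layer in LAYER_STORAGE_MB}
    total = sum(LAYER_STORAGE_MB.values())
    for layer in sorted(LAYER_STORAGE_MB, key=lambda l: -LAYER_STORAGE_MB[l]):
        if total <= budget_mb:
            break
        # Among the real compression options, take the gentlest one.
        name, ratio, drop = min(
            (c for c in CANDIDATES[layer] if c[0] != "none"),
            key=lambda c: c[2],
        )
        total -= LAYER_STORAGE_MB[layer] * (1.0 - ratio)
        plan[layer] = (name, ratio, drop)
    return plan, total
```

A real system would replace the static accuracy-drop estimates with measured (or learned) accuracy, latency, and energy models, and would search over combinations and hyperparameters rather than a single greedy pass, but the interface is the same: per-layer technique choices in, a deployment plan and resource footprint out.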
