
Data-Driven Compression of Convolutional Neural Networks

by Ramit Pahwa, et al.
The University of Texas at Austin

Deploying trained convolutional neural networks (CNNs) to mobile devices is a challenging task because the deployed model must simultaneously be fast, lightweight, and accurate. Designing and training a CNN architecture that does well on all three metrics is highly non-trivial and can be very time-consuming if done by hand. One way to solve this problem is to compress trained CNN models before deploying them to mobile devices. This work asks and answers three questions on compressing CNN models automatically: a) How can the trade-off between speed, memory, and accuracy be controlled during model compression? b) In practice, a deployed model may not see all classes and/or may not need to produce all class labels. Can this fact be exploited to improve the trade-off? c) How can the compression algorithm be scaled to execute within a reasonable amount of time across many deployments? The paper demonstrates that a model compression algorithm combining reinforcement learning with architecture search and knowledge distillation can answer these questions in the affirmative. Experimental results are provided for current state-of-the-art CNN model families for image feature extraction, such as VGG and ResNet, evaluated on the CIFAR datasets.
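The abstract names knowledge distillation as one ingredient of the compression pipeline. As a rough illustration of that component only, the sketch below implements the standard temperature-scaled distillation loss in the style of Hinton et al., not the paper's exact formulation; the temperature `T` and mixing weight `alpha` are hypothetical choices for the example.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax (numerically stabilized)."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, true_label, T=4.0, alpha=0.9):
    """Weighted sum of a soft loss (match the teacher's softened
    output distribution) and a hard cross-entropy loss on the label."""
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # Cross-entropy against softened teacher targets, scaled by T^2
    # so its gradient magnitude is comparable to the hard loss.
    soft_loss = -np.sum(p_teacher * np.log(p_student + 1e-12)) * T * T
    # Standard cross-entropy with the ground-truth label.
    hard_loss = -np.log(softmax(student_logits)[true_label] + 1e-12)
    return alpha * soft_loss + (1 - alpha) * hard_loss
```

During compression, a small student architecture would be trained to minimize this loss against the large trained CNN acting as teacher; a student whose logits agree with the teacher's incurs a lower loss than one that disagrees.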



