Stability Based Filter Pruning for Accelerating Deep CNNs

11/20/2018
by Pravendra Singh, et al.

Convolutional neural networks (CNNs) have achieved impressive performance on a wide variety of tasks (classification, detection, etc.) across multiple domains, at the cost of high computational and memory requirements. Leveraging CNNs for real-time applications therefore requires model compression approaches that reduce not only the total number of parameters but also the overall computation. In this work, we present a stability-based approach for filter-level pruning of CNNs. We evaluate the proposed approach on different architectures (LeNet, VGG-16, ResNet, and Faster RCNN) and datasets, and demonstrate its generalizability through extensive experiments. Moreover, our compressed models can be used at run-time without requiring any special libraries or hardware. Our method reduces the number of FLOPs by a factor of 6.03x and the GPU memory footprint by more than 17x, significantly outperforming other state-of-the-art filter pruning methods.
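The abstract does not spell out the stability criterion itself, but the general recipe of filter-level (structured) pruning that it builds on can be illustrated with a short sketch. The example below is an assumption-laden stand-in, not the paper's method: it scores each filter of a convolutional layer by its L1 norm (a common magnitude proxy in place of the stability measure) and rebuilds the layer with only the top-scoring filters. It is written with PyTorch for concreteness; the function name prune_conv_filters and the keep_ratio parameter are purely illustrative.

import torch
import torch.nn as nn

def prune_conv_filters(conv: nn.Conv2d, keep_ratio: float) -> nn.Conv2d:
    """Return a new Conv2d keeping only the highest-scoring output filters.

    The score used here is the plain L1 norm of each filter, a generic
    stand-in for an importance measure (the paper's stability-based
    criterion is not reproduced here).
    """
    weight = conv.weight.data                     # shape: (out_c, in_c, kH, kW)
    scores = weight.abs().sum(dim=(1, 2, 3))      # one score per output filter
    n_keep = max(1, int(keep_ratio * weight.size(0)))
    keep_idx = torch.argsort(scores, descending=True)[:n_keep]

    pruned = nn.Conv2d(conv.in_channels, n_keep, conv.kernel_size,
                       stride=conv.stride, padding=conv.padding,
                       bias=conv.bias is not None)
    pruned.weight.data = weight[keep_idx].clone()
    if conv.bias is not None:
        pruned.bias.data = conv.bias.data[keep_idx].clone()
    return pruned

# Example: keep 50% of the filters in a 64-filter layer.
layer = nn.Conv2d(3, 64, kernel_size=3, padding=1)
smaller = prune_conv_filters(layer, keep_ratio=0.5)
print(smaller.weight.shape)  # torch.Size([32, 3, 3, 3])

In a full network, removing output filters from one layer also requires trimming the matching input channels of the following layer (and any intervening batch-norm parameters), and the pruned model is typically fine-tuned afterwards to recover accuracy.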

Related research

05/11/2019
Play and Prune: Adaptive Filter Pruning for Deep Model Compression
While convolutional neural networks (CNN) have achieved impressive perfo...

08/08/2019
Efficient Inference of CNNs via Channel Pruning
The deployment of Convolutional Neural Networks (CNNs) on resource const...

08/22/2017
Learning Efficient Convolutional Networks through Network Slimming
The deployment of deep convolutional neural networks (CNNs) in many real...

09/06/2018
2PFPCE: Two-Phase Filter Pruning Based on Conditional Entropy
Deep Convolutional Neural Networks (CNNs) offer remarkable performance o...

04/01/2023
Progressive Channel-Shrinking Network
Currently, salience-based channel pruning makes continuous breakthroughs...

02/17/2023
Less is More: The Influence of Pruning on the Explainability of CNNs
Modern, state-of-the-art Convolutional Neural Networks (CNNs) in compute...

01/23/2019
Towards Compact ConvNets via Structure-Sparsity Regularized Filter Pruning
The success of convolutional neural networks (CNNs) in computer vision a...
