Recovering from Random Pruning: On the Plasticity of Deep Convolutional Neural Networks

01/31/2018
by Deepak Mittal, et al.

Recently there has been a lot of work on pruning filters from deep convolutional neural networks (CNNs) with the intention of reducing computations. The key idea is to rank the filters according to a certain criterion (say, the l_1-norm or the average percentage of zeros) and retain only the top-ranked filters. Once the low-scoring filters are pruned away, the remainder of the network is fine-tuned and is shown to give performance comparable to the original unpruned network. In this work, we report experiments which suggest that the comparable performance of the pruned network is not due to the specific criterion chosen but due to the inherent plasticity of deep neural networks, which allows them to recover from the loss of pruned filters once the remaining filters are fine-tuned. Specifically, we show counter-intuitive results wherein by randomly pruning 25-50% of the filters from deep CNNs we are able to obtain the same performance as state-of-the-art pruning methods. We empirically validate our claims through an exhaustive evaluation with VGG-16 and ResNet-50. Further, we also evaluate a real-world scenario where a CNN trained on all 1000 ImageNet classes needs to be tested on only a small set of classes at test time (say, only animals). We create a new benchmark dataset from ImageNet to evaluate such class-specific pruning and show that even here a random pruning strategy gives close to state-of-the-art performance. Lastly, unlike existing approaches which focus mainly on image classification, we also report results on object detection. We show that using a simple random pruning strategy we can achieve a significant speed-up in object detection (a 74% improvement in fps) while retaining the same accuracy as the original Faster R-CNN model.
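To make the contrast concrete, below is a minimal sketch (in PyTorch; not code from the paper) of pruning a convolutional layer's filters either by the l_1-norm criterion or uniformly at random. The helper names are illustrative, and a complete pipeline would also shrink the next layer's input channels and then fine-tune the pruned network, as the abstract describes.

```python
import torch
import torch.nn as nn

def l1_filter_scores(conv: nn.Conv2d) -> torch.Tensor:
    # Score each output filter by the l_1-norm of its weights
    # (one of the ranking criteria the abstract mentions).
    return conv.weight.detach().abs().sum(dim=(1, 2, 3))

def keep_mask(scores: torch.Tensor, prune_frac: float, random: bool) -> torch.Tensor:
    # Boolean mask over filters: True = keep. With random=True, the
    # same number of filters is dropped uniformly at random
    # (the paper's counter-intuitive baseline).
    n = scores.numel()
    n_prune = int(prune_frac * n)
    order = torch.randperm(n) if random else scores.argsort()  # lowest scores first
    mask = torch.ones(n, dtype=torch.bool)
    mask[order[:n_prune]] = False
    return mask

def prune_conv(conv: nn.Conv2d, mask: torch.Tensor) -> nn.Conv2d:
    # Build a smaller Conv2d containing only the kept filters.
    # A full pipeline would also prune the next layer's matching
    # input channels and fine-tune the network afterwards.
    pruned = nn.Conv2d(conv.in_channels, int(mask.sum()),
                       conv.kernel_size, conv.stride,
                       conv.padding, bias=conv.bias is not None)
    pruned.weight.data = conv.weight.data[mask].clone()
    if conv.bias is not None:
        pruned.bias.data = conv.bias.data[mask].clone()
    return pruned

conv = nn.Conv2d(64, 128, 3, padding=1)
scores = l1_filter_scores(conv)
# Prune 50% of the filters, here at random; set random=False for l_1 ranking.
smaller = prune_conv(conv, keep_mask(scores, 0.5, random=True))
print(smaller)  # Conv2d(64, 64, kernel_size=(3, 3), ...)
```

Under the paper's claim, fine-tuning after either choice of mask recovers comparable accuracy in the 25-50% pruning regime.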
