Pruning Convolutional Neural Networks for Resource Efficient Inference

11/19/2016
by Pavlo Molchanov, et al.

We propose a new formulation for pruning convolutional kernels in neural networks to enable efficient inference. We interleave greedy criteria-based pruning with fine-tuning by backpropagation, a computationally efficient procedure that maintains good generalization in the pruned network. We propose a new criterion based on Taylor expansion that approximates the change in the cost function induced by pruning network parameters. We focus on transfer learning, where large pretrained networks are adapted to specialized tasks. The proposed criterion demonstrates superior performance compared to other criteria, e.g., the norm of kernel weights or feature-map activation, for pruning large CNNs after adaptation to fine-grained classification tasks (Birds-200 and Flowers-102), relying only on first-order gradient information. We also show that pruning can lead to more than 10x theoretical (5x practical) reduction in adapted 3D-convolutional filters with a small drop in accuracy in a recurrent gesture classifier. Finally, we show results for the large-scale ImageNet dataset to emphasize the flexibility of our approach.
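For a concrete picture of what a first-order Taylor pruning criterion looks like in practice, here is a minimal PyTorch sketch that scores the output channels of a single convolutional layer by the absolute product of activations and their gradients. The layer, dummy data, and classifier head are illustrative placeholders (not the authors' code), and the cross-layer normalization described in the paper is omitted:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative sketch only: rank the output channels of one conv layer by a
# first-order Taylor criterion computed from the activation and its gradient.
conv = nn.Conv2d(3, 16, kernel_size=3, padding=1)
head = nn.Linear(16 * 32 * 32, 10)

x = torch.randn(8, 3, 32, 32)          # dummy image batch
labels = torch.randint(0, 10, (8,))    # dummy labels

z = conv(x)                            # feature maps, shape (8, 16, 32, 32)
z.retain_grad()                        # keep the gradient w.r.t. the activation

loss = F.cross_entropy(head(z.flatten(1)), labels)
loss.backward()

# Per-channel criterion: |mean over spatial positions of (dC/dz * z)|,
# then averaged over the batch.
score = (z.grad * z).mean(dim=(2, 3)).abs().mean(dim=0)   # shape (16,)

# Channels with the smallest scores are the first candidates for pruning.
prune_order = torch.argsort(score)
print(prune_order)
```

In the full procedure, such scoring would alternate with fine-tuning: prune the lowest-scoring channels, retrain briefly, and repeat.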

research · 03/15/2018
Efficient Hardware Realization of Convolutional Neural Networks using Intra-Kernel Regular Pruning
The recent trend toward increasingly deep convolutional neural networks ...

research · 11/26/2021
How Well Do Sparse ImageNet Models Transfer?
Transfer learning is a classic paradigm by which models pretrained on la...

research · 05/28/2020
A Feature-map Discriminant Perspective for Pruning Deep Neural Networks
Network pruning has become the de facto tool to accelerate deep neural n...

research · 12/18/2019
Pruning by Explaining: A Novel Criterion for Deep Neural Network Pruning
The success of convolutional neural networks (CNNs) in various applicati...

research · 02/16/2023
WHC: Weighted Hybrid Criterion for Filter Pruning on Convolutional Neural Networks
Filter pruning has attracted increasing attention in recent years for it...

research · 08/06/2021
Basis Scaling and Double Pruning for Efficient Transfer Learning
Transfer learning allows the reuse of deep learning features on new data...

research · 09/08/2020
CNNPruner: Pruning Convolutional Neural Networks with Visual Analytics
Convolutional neural networks (CNNs) have demonstrated extraordinarily g...
