PruneNet: Channel Pruning via Global Importance

05/22/2020
by Ashish Khetan, et al.

Channel pruning is one of the predominant approaches for accelerating deep neural networks. Most existing pruning methods either train from scratch with a sparsity-inducing term such as the group lasso, or prune redundant channels from a pretrained network and then fine-tune it. Both strategies have limitations. Group-lasso training is computationally expensive, slow to converge, and often degraded by regularization bias. Methods that start from a pretrained network either prune channels uniformly across layers or prune based on simple statistics of the network parameters; they thus either ignore the fact that some CNN layers are more redundant than others or fail to adequately identify the level of redundancy in each layer. In this work, we investigate a simple yet effective method for pruning channels: a computationally lightweight, data-driven optimization step that discovers the necessary width of each layer. Experiments on ILSVRC-12 confirm the effectiveness of our approach. With non-uniform pruning across the layers of ResNet-50, we match the FLOP reduction of state-of-the-art channel pruning results while achieving 0.98% higher accuracy. Further, our pruned ResNet-50 outperforms ResNet-34 and ResNet-18, and our pruned ResNet-101 outperforms ResNet-50.
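The abstract does not spell out the optimization step, but the generic idea it builds on, scoring channels by a globally comparable importance measure so that layers end up pruned non-uniformly, can be sketched as follows. This is a minimal illustration, not the paper's method: the per-layer-normalized L1 filter norm used as the score, the function names, and the use of the pruned-channel fraction as a crude stand-in for a FLOP budget are all assumptions for the sake of the example.

```python
# Sketch of channel pruning via a global importance score (illustrative,
# NOT PruneNet's actual data-driven optimization, which the abstract
# does not specify).
import torch.nn as nn


def global_channel_scores(model):
    """Score every output channel of every Conv2d layer.

    The score here is the L1 norm of the channel's filter, normalized
    by the layer's mean norm so that scores are comparable across
    layers (an assumption; the paper's criterion may differ).
    """
    scores = []  # (layer_name, channel_index, score)
    for name, module in model.named_modules():
        if isinstance(module, nn.Conv2d):
            # weight shape: (out_channels, in_channels, kH, kW)
            norms = module.weight.detach().abs().sum(dim=(1, 2, 3))
            norms = norms / norms.mean()
            for c, s in enumerate(norms.tolist()):
                scores.append((name, c, s))
    return scores


def select_channels_to_prune(model, keep_fraction=0.5):
    """Mark the globally lowest-scoring channels for removal.

    Because channels compete across all layers, more redundant layers
    lose more channels than others. The fraction of channels kept is
    only a rough proxy for the FLOP budget.
    """
    scores = sorted(global_channel_scores(model), key=lambda t: t[2])
    n_prune = int(len(scores) * (1.0 - keep_fraction))
    return scores[:n_prune]  # (layer, channel, score) triples to drop


# Example usage with torchvision's ResNet-50 (assumed installed):
# from torchvision.models import resnet50
# to_drop = select_channels_to_prune(resnet50(), keep_fraction=0.5)
```

After selecting channels, one would physically remove the corresponding filters (and the matching input channels of downstream layers) and fine-tune; those rebuilding steps are omitted here for brevity.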

