WHC: Weighted Hybrid Criterion for Filter Pruning on Convolutional Neural Networks

02/16/2023
by   Shaowu Chen, et al.

Filter pruning has attracted increasing attention in recent years for its ability to compress and accelerate convolutional neural networks. Various data-independent criteria, including norm-based and relationship-based ones, have been proposed to prune the least important filters. However, these state-of-the-art criteria fail to fully consider the dissimilarity of filters and can therefore lead to performance degradation. In this paper, we first analyze the limitations of relationship-based criteria with examples, and then introduce a new data-independent criterion, the Weighted Hybrid Criterion (WHC), to tackle the problems of both norm-based and relationship-based criteria. By taking both the magnitude of each filter and the linear dependence between filters into consideration, WHC can robustly identify the most redundant filters, which can be safely pruned without causing severe performance degradation. Extensive pruning experiments in a simple one-shot manner demonstrate the effectiveness of the proposed WHC. In particular, WHC can prune more than 42% of the floating-point operations of ResNet-50 on ImageNet without any performance loss in top-5 accuracy.
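The exact WHC formula is defined in the paper; as a rough illustration of the idea of combining a norm-based term with a dissimilarity term, the following sketch (a hypothetical helper, not the authors' code) scores each filter by its L2 norm weighted by its average cosine dissimilarity to the other filters in the layer. Filters that are small in magnitude and nearly linearly dependent on others receive low scores and become pruning candidates.

```python
import numpy as np

def hybrid_pruning_scores(filters):
    """Score filters by magnitude weighted by average dissimilarity.

    filters: array of shape (n_filters, k), each row a flattened filter.
    Returns one score per filter; lower scores mark pruning candidates.
    """
    norms = np.linalg.norm(filters, axis=1)            # norm-based term
    unit = filters / np.maximum(norms[:, None], 1e-12)  # unit-normalized rows
    cos = unit @ unit.T                                 # pairwise cosine similarity
    n = filters.shape[0]
    # Average dissimilarity (1 - |cos|) to the other filters; the diagonal
    # contributes zero since |cos(f_i, f_i)| = 1, so divide by n - 1.
    dissim = np.sum(1.0 - np.abs(cos), axis=1) / (n - 1)
    return norms * dissim

# Toy example: filter 2 is a scaled copy of filter 0, so both are
# down-weighted; the smaller duplicate gets the lowest score.
f = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [2.0, 0.0]])
scores = hybrid_pruning_scores(f)   # array([0.5, 1. , 1. ])
prune_idx = int(np.argmin(scores))  # 0
```

Note how a pure norm criterion would keep both duplicates and instead prune the orthogonal filter 1, whereas the hybrid score penalizes linear dependence as the abstract describes.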

Related research
04/08/2019

Meta Filter Pruning to Accelerate Deep Convolutional Neural Networks

Existing methods usually utilize pre-defined criterions, such as p-norm,...
03/11/2022

Improve Convolutional Neural Network Pruning by Maximizing Filter Variety

Neural network pruning is a widely used strategy for reducing model stor...
12/10/2019

Magnitude and Uncertainty Pruning Criterion for Neural Networks

Neural networks have achieved dramatic improvements in recent years and ...
07/14/2020

REPrune: Filter Pruning via Representative Election

Even though norm-based filter pruning methods are widely accepted, it is...
04/24/2020

Convolution-Weight-Distribution Assumption: Rethinking the Criteria of Channel Pruning

Channel pruning is one of the most important techniques for compressing ...
11/19/2016

Pruning Convolutional Neural Networks for Resource Efficient Inference

We propose a new formulation for pruning convolutional kernels in neural...
01/31/2018

Recovering from Random Pruning: On the Plasticity of Deep Convolutional Neural Networks

Recently there has been a lot of work on pruning filters from deep convo...
