Convolution-Weight-Distribution Assumption: Rethinking the Criteria of Channel Pruning

04/24/2020
by Zhongzhan Huang, et al.

Channel pruning is one of the most important techniques for compressing neural networks with convolutional filters. However, in our study, we find strong similarities among several primary pruning criteria proposed in recent years: the rankings of filter "importance" within a convolutional layer produced by these criteria are almost identical, resulting in similar pruned structures. This finding can be explained by our assumption that the trained convolutional filters approximately follow a Gaussian-like distribution, which we demonstrate through systematic and comprehensive statistical tests. Under this assumption, the similarity of these criteria is theoretically proved. Moreover, we also find that if the network has too much redundancy (i.e., each convolutional layer contains a large number of filters), these criteria cannot distinguish the "importance" of the filters. This phenomenon arises because, when the redundancy is large enough and our assumption holds, the convolutional layer forms a special geometric structure: for every pair of filters in one layer, (1) their ℓ_2 norms are equal; (2) they are equidistant; and (3) they are orthogonal. The full appendix is released at https://github.com/dedekinds/CWDA.
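Below is a minimal sketch of how these claims could be checked empirically, assuming PyTorch/torchvision and a pretrained ResNet-18; the chosen layer and the D'Agostino-Pearson normality test are illustrative assumptions, not the paper's exact protocol. It probes whether pooled filter weights look Gaussian-like and measures how close the filters of one layer are to having equal norms, equal pairwise distances, and mutual orthogonality.

```python
import torch
from scipy import stats
from torchvision import models

# Assumed setup: a pretrained ResNet-18 (torchvision >= 0.13 weights API)
# and one arbitrarily chosen convolutional layer.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
conv = model.layer3[0].conv1

# Flatten each filter into a vector: (num_filters, in_channels * k * k).
W = conv.weight.detach().reshape(conv.weight.shape[0], -1)

# (a) Gaussian-like check on the pooled weights (illustrative test choice).
_, p_value = stats.normaltest(W.flatten().numpy())
print(f"D'Agostino-Pearson normality p-value: {p_value:.3g}")

# (b) Geometric structure expected under high redundancy:
#     near-equal l2 norms, near-equal pairwise distances, near-orthogonality.
norms = W.norm(dim=1)                      # per-filter l2 norms
dists = torch.cdist(W, W)                  # pairwise Euclidean distances
cosine = (W @ W.T) / torch.outer(norms, norms)
off_diag = ~torch.eye(W.shape[0], dtype=torch.bool)

print("norm spread (std/mean):     ", (norms.std() / norms.mean()).item())
print("distance spread (std/mean): ",
      (dists[off_diag].std() / dists[off_diag].mean()).item())
print("mean |cos| between filters: ", cosine[off_diag].abs().mean().item())
```

If the two spread ratios and the mean absolute cosine are all close to zero, the layer is close to the equal-norm, equidistant, orthogonal configuration described above, which is exactly the regime in which norm-based criteria can no longer rank filters meaningfully.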

