Inf-CP: A Reliable Channel Pruning based on Channel Influence

12/05/2021
by Bilan Lai, et al.

One of the most effective approaches to channel pruning is to trim channels according to the importance of each neuron. However, measuring the importance of each neuron exactly is an NP-hard problem. Previous works prune by considering the statistics of a single layer or of several successive layers of neurons. These methods cannot remove the effect that different data have on the reconstruction error, and no existing work proves that the absolute values of the parameters can be used directly to judge the importance of the weights. A more reasonable approach is to eliminate the differences between batches of data so that the influence of each weight can be measured accurately. In this paper, we propose to use ensemble learning to train models on different batches of data and to use the influence function (a classic technique from robust statistics) to trace the model's predictions back to the gradients of its training parameters, so that we can determine the responsibility of each parameter for a prediction, which we call its "influence". In addition, we theoretically show that back-propagation in a deep network is a first-order Taylor approximation of the influence function of the weights. Extensive experiments demonstrate that pruning based on the influence function, combined with the idea of ensemble learning, is much more effective than focusing only on error reconstruction. Experiments on CIFAR show that influence-based pruning achieves state-of-the-art results.
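The abstract does not spell out the scoring rule, but the stated first-order Taylor connection suggests one simple way such an "influence" score could be computed. The sketch below is a minimal PyTorch illustration, not the paper's exact procedure: it assumes the influence of an output channel of a Conv2d layer can be approximated by |weight × gradient| summed over that channel and averaged across batches, and the function name and pruning heuristic are hypothetical.

```python
# Hypothetical sketch: first-order channel-influence scores for one Conv2d layer.
# Assumption: influence ≈ |theta * dL/dtheta| reduced per output channel and
# averaged over batches; the paper's actual criterion may differ.
import torch
import torch.nn as nn

def channel_influence_scores(model, conv, data_loader, loss_fn, device="cpu"):
    """Accumulate a per-output-channel influence score for `conv` inside `model`."""
    scores = torch.zeros(conv.out_channels, device=device)
    model.to(device).train()
    for inputs, targets in data_loader:
        inputs, targets = inputs.to(device), targets.to(device)
        model.zero_grad()
        loss = loss_fn(model(inputs), targets)
        loss.backward()
        # First-order Taylor term |theta * dL/dtheta|, summed over (in, kH, kW)
        # so that one score remains per output channel.
        taylor = (conv.weight * conv.weight.grad).abs().sum(dim=(1, 2, 3))
        scores += taylor.detach()
    return scores / max(len(data_loader), 1)

# Channels with the smallest scores would be the pruning candidates, e.g.:
#   prune_idx = torch.argsort(scores)[:num_channels_to_prune]
```

Running this routine on several disjoint batches (or on models trained on different batches, in the ensemble spirit described above) and aggregating the resulting scores would give a batch-robust ranking of channels; again, this aggregation step is an illustrative assumption.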

Related research

03/29/2017 · Towards thinner convolutional neural networks through Gradually Global Pruning
Deep network pruning is an effective method to reduce the storage and co...

11/16/2017 · NISP: Pruning Networks using Neuron Importance Score Propagation
To reduce the significant redundancy in deep Convolutional Neural Networ...

08/13/2023 · Influence Function Based Second-Order Channel Pruning-Evaluating True Loss Changes For Pruning Is Possible Without Retraining
A challenge of channel pruning is designing efficient and effective crit...

03/15/2020 · Channel Pruning Guided by Classification Loss and Feature Importance
In this work, we propose a new layer-by-layer channel pruning method cal...

11/06/2020 · Channel Pruning via Multi-Criteria based on Weight Dependency
Channel pruning has demonstrated its effectiveness in compressing ConvNe...

10/24/2021 · Exploring Gradient Flow Based Saliency for DNN Model Compression
Model pruning aims to reduce the deep neural network (DNN) model size or...

06/16/2023 · Magnificent Minified Models
This paper concerns itself with the task of taking a large trained neura...
