Influence Function Based Second-Order Channel Pruning: Evaluating True Loss Changes For Pruning Is Possible Without Retraining

08/13/2023
by Hongrong Cheng, et al.

A challenge of channel pruning is designing efficient and effective criteria for selecting the channels to prune. A widely used criterion is minimal performance degradation. Accurately evaluating the true performance degradation requires retraining the surviving weights to convergence, which is prohibitively slow; existing pruning methods therefore evaluate the degradation using the pre-pruning weights, without retraining. However, we observe that the loss changes differ significantly with and without retraining. This motivates us to develop a technique for evaluating true loss changes without retraining, with which the channels to prune can be selected more reliably and confidently. We first derive a closed-form estimator of the true loss change per pruning-mask change, using influence functions without retraining. The influence function, a tool from robust statistics, reveals the impact of a training sample on the model's predictions; we repurpose it to assess the impact of mask changes on the true loss. We then show how to assess the importance of all channels simultaneously, and develop a novel global channel pruning algorithm accordingly. We conduct extensive experiments to verify the effectiveness of the proposed algorithm. To the best of our knowledge, we are the first to show that evaluating true loss changes for pruning without retraining is possible. This finding opens up opportunities for a series of new paradigms that differ from existing pruning methods. The code is available at https://github.com/hrcheng1066/IFSO.
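To make the mechanics concrete, the sketch below scores each channel by the loss change predicted by a second-order Taylor expansion in the channel-mask variables, computed with a Hessian-vector product, and then ranks all channels globally. This is a minimal illustration under assumed names (compute_loss, estimate_loss_change, a toy two-layer gated network), not the paper's exact IFSO estimator, which additionally uses influence functions to account for how the surviving weights would adapt.

```python
# Hedged sketch (not the authors' implementation): estimate the loss change
# from flipping one channel mask from 1 to 0 with a second-order Taylor
# expansion in the mask vector, via a Hessian-vector product.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
x, y = torch.randn(64, 8), torch.randn(64, 1)        # toy data
layer1, layer2 = nn.Linear(8, 4), nn.Linear(4, 1)    # toy gated network
masks = torch.ones(4, requires_grad=True)            # one gate per channel

def compute_loss(m):
    hidden = torch.relu(layer1(x)) * m               # per-channel gates
    return F.mse_loss(layer2(hidden), y)

def estimate_loss_change(channel_idx):
    """Approximate L(m + delta) - L(m) for delta = -e_c (prune channel c)
    using  dL ~= g^T delta + 0.5 * delta^T H delta,  where g and H are the
    gradient and Hessian of the loss w.r.t. the masks. For this toy loss,
    which is quadratic in m, the expansion happens to be exact."""
    loss = compute_loss(masks)
    (g,) = torch.autograd.grad(loss, masks, create_graph=True)
    delta = torch.zeros_like(masks)
    delta[channel_idx] = -1.0                        # the mask goes 1 -> 0
    (hvp,) = torch.autograd.grad(g @ delta, masks)   # Hessian-vector product
    return (g @ delta + 0.5 * delta @ hvp).item()

# Score every gated channel, then rank globally: channels with the smallest
# estimated loss increase are the cheapest to prune.
scores = [estimate_loss_change(c) for c in range(masks.numel())]
order = sorted(range(len(scores)), key=lambda c: scores[c])
print("estimated loss changes:", scores)
print("pruning order (least harmful first):", order)
```

Note that this sketch evaluates channels one Hessian-vector product at a time and ignores how the surviving weights would re-converge after pruning; the paper's contribution is precisely a closed-form, influence-function-based estimate of that post-retraining loss change.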

Related research

- Carrying out CNN Channel Pruning in a White Box (04/24/2021): Channel Pruning has been long adopted for compressing CNNs, which signif...
- DMCP: Differentiable Markov Channel Pruning for Neural Networks (05/07/2020): Recent works imply that the channel pruning can be regarded as searching...
- Inf-CP: A Reliable Channel Pruning based on Channel Influence (12/05/2021): One of the most effective methods of channel pruning is to trim on the b...
- Channel Pruning Guided by Classification Loss and Feature Importance (03/15/2020): In this work, we propose a new layer-by-layer channel pruning method cal...
- Data-free Backdoor Removal based on Channel Lipschitzness (08/05/2022): Recent studies have shown that Deep Neural Networks (DNNs) are vulnerabl...
- Channel Pruning via Optimal Thresholding (03/10/2020): Structured pruning, especially channel pruning is widely used for the re...
- Fantastic Weights and How to Find Them: Where to Prune in Dynamic Sparse Training (06/21/2023): Dynamic Sparse Training (DST) is a rapidly evolving area of research tha...
