NISP: Pruning Networks using Neuron Importance Score Propagation

11/16/2017
by Ruichi Yu, et al.

To reduce the significant redundancy in deep Convolutional Neural Networks (CNNs), most existing methods prune neurons by considering only the statistics of an individual layer or two consecutive layers (e.g., pruning one layer to minimize the reconstruction error of the next), ignoring the effect of error propagation through the deep network. In contrast, we argue that it is essential to prune neurons across the entire network jointly, based on a unified goal: minimizing the reconstruction error of important responses in the "final response layer" (FRL), the second-to-last layer before classification, so that a pruned network retains its predictive power. Specifically, we apply feature-ranking techniques to measure the importance of each neuron in the FRL, formulate network pruning as a binary integer optimization problem, and derive a closed-form solution to it for pruning neurons in earlier layers. Based on this theoretical analysis, we propose the Neuron Importance Score Propagation (NISP) algorithm, which propagates the importance scores of the final responses to every neuron in the network. The CNN is pruned by removing the neurons with the least importance, and then fine-tuned to retain its predictive power. NISP is evaluated on several datasets with multiple CNN models and demonstrated to achieve significant acceleration and compression with negligible accuracy loss.
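The core of NISP is a layer-by-layer backward propagation of importance: for a fully-connected layer with weight matrix W, the closed-form propagation reduces to scoring the layer's inputs as s_k = |W|^T s_{k+1}. The NumPy sketch below illustrates that rule on a toy multi-layer perceptron; the function names, the per-layer `keep_ratio`, and the random toy data are illustrative assumptions rather than the paper's exact implementation, and convolutional layers would need the analogous weighted propagation over their receptive fields.

```python
import numpy as np

def propagate_importance(weights, final_scores):
    """Propagate importance scores from the final response layer (FRL)
    back through a stack of fully-connected layers.

    weights[k] is the (n_out x n_in) matrix of layer k; final_scores
    holds the FRL importance from a feature-ranking method. A neuron's
    score is the absolute-weight-weighted sum of the scores of the
    neurons it feeds: s_k = |W_k|^T s_{k+1}.
    """
    scores = [final_scores]
    for W in reversed(weights):
        scores.insert(0, np.abs(W).T @ scores[0])
    return scores  # one score vector per layer, input layer first

def prune_mask(scores, keep_ratio=0.5):
    """Boolean mask keeping the top `keep_ratio` fraction of neurons."""
    k = max(1, int(len(scores) * keep_ratio))
    threshold = np.sort(scores)[::-1][k - 1]
    return scores >= threshold

# Toy usage: a 3-layer MLP (128 -> 64 -> 32) with random weights.
rng = np.random.default_rng(0)
weights = [rng.standard_normal((64, 128)), rng.standard_normal((32, 64))]
frl_scores = rng.random(32)  # in the paper, from a feature-ranking method
layer_scores = propagate_importance(weights, frl_scores)
masks = [prune_mask(s, keep_ratio=0.5) for s in layer_scores[:-1]]
print([m.sum() for m in masks])  # neurons kept in each earlier layer
```

In practice the FRL scores would come from a feature-ranking technique applied to the final responses, each layer would be pruned with its mask, and the smaller network would then be fine-tuned to recover accuracy, as the abstract describes.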


