End-to-End Sensitivity-Based Filter Pruning

04/15/2022
by Zahra Babaiee, et al.

In this paper, we present a novel sensitivity-based filter pruning algorithm (SbF-Pruner) that learns the importance scores of the filters in each layer end-to-end. Our method derives the scores from the filter weights themselves, enabling it to account for correlations between the filters of a layer. Moreover, by training the pruning scores of all layers simultaneously, our method accounts for layer interdependencies, which is essential for finding a performant sparse sub-network. SbF-Pruner can train and generate a pruned network from scratch in a straightforward, one-stage training process, without requiring a pretrained network. Finally, we need neither layer-specific hyperparameters nor pre-defined layer budgets, since SbF-Pruner implicitly determines the appropriate number of channels in each layer. Our experimental results on different network architectures suggest that SbF-Pruner outperforms advanced pruning methods. Notably, on CIFAR-10, without requiring a pretrained baseline network, we obtain a 1.02% accuracy improvement over the baseline reported for state-of-the-art pruning algorithms, while reducing the parameter count by 52.3% (for ResNet101), which beats the state-of-the-art pruning algorithms by a margin of 9.5%.
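The core idea described above, per-filter importance scores that are computed from the filter weights and trained jointly with the network, can be sketched as follows. This is a minimal, hypothetical illustration in PyTorch, not the authors' actual SbF-Pruner implementation; the `ScoredConv2d` module, its linear scorer, and the sparsity penalty are assumptions made for the sake of the example.

```python
import torch
import torch.nn as nn

class ScoredConv2d(nn.Module):
    """Conv layer whose output channels are gated by learned scores.

    The gates are computed from the filter weights themselves, so
    correlated filters receive correlated scores (one of the key ideas
    in the abstract). Hypothetical sketch only.
    """
    def __init__(self, in_ch, out_ch, k):
        super().__init__()
        self.conv = nn.Conv2d(in_ch, out_ch, k, padding=k // 2)
        # One scalar score per filter, derived from the flattened weights.
        self.scorer = nn.Linear(in_ch * k * k, 1)

    def gates(self):
        flat = self.conv.weight.flatten(1)       # (out_ch, in_ch * k * k)
        return torch.sigmoid(self.scorer(flat))  # (out_ch, 1)

    def forward(self, x):
        # Soft-gate each output channel; gates near zero mark prunable filters.
        return self.conv(x) * self.gates().view(1, -1, 1, 1)

# Scores of all layers are trained simultaneously with the task loss,
# so layer interdependencies are captured in a single training stage.
net = nn.Sequential(ScoredConv2d(3, 16, 3), nn.ReLU(), ScoredConv2d(16, 32, 3))
out = net(torch.randn(2, 3, 8, 8))

# A sparsity penalty on the gates pushes unimportant filters toward zero;
# after training, filters whose gates fall below a threshold are removed,
# which implicitly sets the channel budget of each layer.
sparsity_loss = sum(m.gates().abs().mean()
                    for m in net if isinstance(m, ScoredConv2d))
```

In a full training loop, `sparsity_loss` would be added (with a weight) to the task loss and backpropagated through both the network weights and the scorer, so no pretrained model or per-layer budget is needed.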

Related research
- Pruning by Active Attention Manipulation (10/20/2022): Filter pruning of a CNN is typically achieved by applying discrete masks...
- Data Agnostic Filter Gating for Efficient Deep Networks (10/28/2020): To deploy a well-trained CNN model on low-end computation edge devices, ...
- AutoPruner: An End-to-End Trainable Filter Pruning Method for Efficient Deep Model Inference (05/23/2018): Channel pruning is an important family of methods to speedup deep model'...
- Dynamic Sparse Training: Find Efficient Sparse Network From Scratch With Trainable Masked Layers (05/14/2020): We present a novel network pruning algorithm called Dynamic Sparse Train...
- Graph Pruning for Model Compression (11/22/2019): Previous AutoML pruning works utilized individual layer features to auto...
- SCOP: Scientific Control for Reliable Neural Network Pruning (10/21/2020): This paper proposes a reliable neural network pruning algorithm by setti...
- CAP: instance complexity-aware network pruning (09/08/2022): Existing differentiable channel pruning methods often attach scaling fac...
