Pruning via Iterative Ranking of Sensitivity Statistics

06/01/2020
by Stijn Verdenius, et al.

With the introduction of SNIP [arXiv:1810.02340v2], it has been demonstrated that modern neural networks can be pruned effectively before training. Its sensitivity criterion, however, has since been criticized for failing to propagate the training signal properly, or even for disconnecting layers entirely. As a remedy, GraSP [arXiv:2002.07376v1] was introduced, at the cost of additional complexity. In this work, we show that by applying the sensitivity criterion iteratively in smaller steps - still before training - we can improve its performance without a complicated implementation. We call the resulting method 'SNIP-it'. We then demonstrate how it can be applied for both structured and unstructured pruning, before and/or during training, thereby achieving state-of-the-art sparsity-performance trade-offs while providing the computational benefits of pruning from the very start of training. Finally, we evaluate our methods for robustness to overfitting, layer disconnection, and adversarial attacks.
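To make the core idea concrete, below is a minimal sketch of iterative sensitivity-based pruning in PyTorch. It is an illustration under assumptions, not the paper's released implementation: the helper names (snip_scores, snip_it_prune), the linear sparsity schedule, and the single-batch scoring are all simplifications chosen for brevity.

```python
import torch

def snip_scores(model, loss_fn, data, targets):
    """SNIP-style sensitivity |theta * dL/dtheta| per weight, from one batch."""
    model.zero_grad()
    loss = loss_fn(model(data), targets)
    loss.backward()
    return {n: (p * p.grad).abs()
            for n, p in model.named_parameters() if p.grad is not None}

def snip_it_prune(model, loss_fn, data, targets,
                  target_sparsity=0.95, steps=5):
    """Prune to target_sparsity in several increments, re-ranking each time."""
    masks = {n: torch.ones_like(p) for n, p in model.named_parameters()}
    for step in range(1, steps + 1):
        # fraction of weights to keep after this step (linear schedule, assumed)
        keep = 1.0 - target_sparsity * step / steps
        scores = snip_scores(model, loss_fn, data, targets)
        # already-pruned weights score zero, since their value is zero
        flat = torch.cat([s.flatten() for s in scores.values()])
        k = max(1, int(keep * flat.numel()))
        threshold = torch.topk(flat, k).values.min()
        with torch.no_grad():
            for n, p in model.named_parameters():
                if n in scores:
                    masks[n] = (scores[n] >= threshold).float()
                    p.mul_(masks[n])  # hard-prune in place
    return masks
```

Recomputing the scores after every pruning step is what distinguishes the iterative scheme from one-shot SNIP, which ranks all weights only once: each re-ranking sees the network as it actually is after the previous removals, which is what guards against disconnecting layers. Run on a freshly initialized network with a single batch, this yields binary masks that can then be kept fixed throughout training.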
