One-Cycle Pruning: Pruning ConvNets Under a Tight Training Budget

07/05/2021
by   Nathan Hubens, et al.

Introducing sparsity in a neural network has been an efficient way to reduce its complexity while keeping its performance almost intact. Most of the time, sparsity is introduced using a three-stage pipeline: 1) train the model to convergence, 2) prune the model according to some criterion, 3) fine-tune the pruned model to recover performance. The last two steps are often performed iteratively, leading to reasonable results but also to a time-consuming and complex process. In our work, we propose to get rid of the first step of the pipeline and to combine the two other steps in a single pruning-training cycle, allowing the model to jointly learn the optimal weights while being pruned. We do this by introducing a novel pruning schedule, named One-Cycle Pruning, which starts pruning at the very beginning of training and continues until its end. Adopting such a schedule not only leads to better performing pruned models but also drastically reduces the training budget required to prune a model. Experiments are conducted on a variety of architectures (VGG-16 and ResNet-18) and datasets (CIFAR-10, CIFAR-100 and Caltech-101), and for relatively high sparsity values (80%). One-Cycle Pruning consistently outperforms commonly used pruning schedules such as One-Shot Pruning, Iterative Pruning and Automated Gradual Pruning, on a fixed training budget.
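The paper defines its own one-cycle sparsity schedule; as a rough illustration of the general idea (pruning and training in a single cycle rather than train, prune, then fine-tune), the PyTorch sketch below ramps a target sparsity from 0 at the first step to a final value at the last step and re-applies global magnitude pruning after every optimizer update. The helper names (sparsity_at, prune_by_magnitude), the cosine ramp, and the toy model and data are assumptions for illustration only, not the authors' implementation.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F


def sparsity_at(step, total_steps, final_sparsity=0.8):
    """Target sparsity at a given step: ramps from 0 at the first step to
    final_sparsity at the last step. The cosine ramp is a stand-in for the
    paper's own one-cycle schedule, not a reproduction of it."""
    progress = min(step / max(total_steps - 1, 1), 1.0)
    return final_sparsity * 0.5 * (1.0 - math.cos(math.pi * progress))


def prune_by_magnitude(model, sparsity):
    """Unstructured global magnitude pruning: zero the smallest-magnitude
    weights so that roughly `sparsity` of all conv/linear weights are removed.
    (No persistent mask is kept in this simplified sketch.)"""
    weights = [m.weight for m in model.modules()
               if isinstance(m, (nn.Conv2d, nn.Linear))]
    scores = torch.cat([w.detach().abs().flatten() for w in weights])
    k = int(sparsity * scores.numel())
    if k == 0:
        return
    threshold = torch.kthvalue(scores, k).values
    with torch.no_grad():
        for w in weights:
            w.mul_((w.abs() > threshold).float())


# Toy usage: a single training cycle during which the model is pruned at every
# step, instead of the classic train -> prune -> fine-tune pipeline.
model = nn.Sequential(nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
                      nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 10))
optimizer = torch.optim.SGD(model.parameters(), lr=0.05)
total_steps = 100
for step in range(total_steps):
    x = torch.randn(16, 3, 32, 32)          # random stand-in for CIFAR images
    y = torch.randint(0, 10, (16,))
    loss = F.cross_entropy(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # Re-prune after each update so sparsification and learning happen jointly.
    prune_by_magnitude(model, sparsity_at(step, total_steps))
```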


Related research:

- Cyclical Pruning for Sparse Neural Networks (02/02/2022): Current methods for pruning neural network weights iteratively apply mag...
- Rethinking the Value of Network Pruning (10/11/2018): Network pruning is widely used for reducing the heavy computational cost...
- Single-shot Channel Pruning Based on Alternating Direction Method of Multipliers (02/18/2019): Channel pruning has been identified as an effective approach to construc...
- Weight Reparametrization for Budget-Aware Network Pruning (07/08/2021): Pruning seeks to design lightweight architectures by removing redundant ...
- S-Cyc: A Learning Rate Schedule for Iterative Pruning of ReLU-based Networks (10/17/2021): We explore a new perspective on adapting the learning rate (LR) schedule...
- Back to Basics: Efficient Network Compression via IMP (11/01/2021): Network pruning is a widely used technique for effectively compressing D...
- Pruning via Iterative Ranking of Sensitivity Statistics (06/01/2020): With the introduction of SNIP [arXiv:1810.02340v2], it has been demonstr...
