Rethinking the Value of Network Pruning

10/11/2018
by Zhuang Liu, et al.

Network pruning is widely used for reducing the heavy computational cost of deep models. A typical pruning algorithm is a three-stage pipeline: training (a large model), pruning, and fine-tuning. During pruning, redundant weights are removed according to a certain criterion, while important weights are kept to best preserve accuracy. In this work, we make several surprising observations which contradict common beliefs. For all six state-of-the-art pruning algorithms we examined, fine-tuning a pruned model gives performance only comparable to, or even worse than, training that model from scratch with randomly initialized weights. For pruning algorithms that assume a predefined target network architecture, one can skip the full pipeline and directly train the target network from scratch. Our observations are consistent across a wide variety of pruning algorithms with multiple network architectures, datasets, and tasks. Our results have several implications: 1) training a large, over-parameterized model is not necessary to obtain an efficient final model; 2) learned "important" weights of the large model are not necessarily useful for the small pruned model; 3) the pruned architecture itself, rather than a set of inherited "important" weights, is what leads to the efficiency benefit in the final model, which suggests that some pruning algorithms could be seen as performing network architecture search.
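To make the comparison concrete, below is a minimal PyTorch sketch of the three-stage pipeline against the scratch-training baseline. It assumes a toy fully connected model and synthetic data, and uses unstructured L1 magnitude pruning from torch.nn.utils.prune as a stand-in criterion; it does not reproduce any of the paper's six pruning algorithms or its experimental setup.

```python
# Sketch: three-stage pruning pipeline (train -> prune -> fine-tune)
# vs. training the same sparse architecture from scratch.
# Toy model, synthetic data, and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune
from torch.utils.data import DataLoader, TensorDataset

# Synthetic stand-in for a real dataset.
train_loader = DataLoader(
    TensorDataset(torch.randn(512, 784), torch.randint(0, 10, (512,))),
    batch_size=64)

def build_model():
    return nn.Sequential(nn.Flatten(), nn.Linear(784, 256), nn.ReLU(),
                         nn.Linear(256, 10))

def train(model, loader, epochs, lr=0.1):
    opt = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return model

# Stage 1: train a large model.
large = train(build_model(), train_loader, epochs=2)

# Stage 2: prune, keeping the "important" (large-magnitude) weights.
for module in large.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)

# Stage 3: fine-tune the pruned model with its inherited weights.
fine_tuned = train(large, train_loader, epochs=1, lr=0.01)

# Baseline from the paper's observation: reuse only the sparsity
# pattern (the architecture), discard the inherited weights, and
# train from random initialization.
scratch = build_model()
for m_large, m_scratch in zip(large.modules(), scratch.modules()):
    if isinstance(m_scratch, nn.Linear):
        prune.custom_from_mask(m_scratch, name="weight",
                               mask=m_large.weight_mask)
scratch = train(scratch, train_loader, epochs=3)
```

In this framing, the paper's claim is that evaluating `scratch` and `fine_tuned` on held-out data yields comparable accuracy, so the value delivered by the pipeline lies in the sparse architecture it discovers rather than in the weights it transfers.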

Related research

- Pruning from Scratch (09/27/2019): Network pruning is an important research field aiming at reducing comput...
- Building an Efficiency Pipeline: Commutativity and Cumulativeness of Efficiency Operators for Transformers (07/31/2022): There exists a wide variety of efficiency methods for natural language p...
- One-Cycle Pruning: Pruning ConvNets Under a Tight Training Budget (07/05/2021): Introducing sparsity in a neural network has been an efficient way to re...
- Sanity-Checking Pruning Methods: Random Tickets can Win the Jackpot (09/22/2020): Network pruning is a method for reducing test-time computational resourc...
- It was the training data pruning too! (03/12/2018): We study the current best model (KDG) for question answering on tabular ...
- Pruning neural networks: is it time to nip it in the bud? (10/10/2018): Pruning is a popular technique for compressing a neural network: a large...
- Basis Scaling and Double Pruning for Efficient Transfer Learning (08/06/2021): Transfer learning allows the reuse of deep learning features on new data...
