Sanity-Checking Pruning Methods: Random Tickets can Win the Jackpot

09/22/2020
by Jingtong Su, et al.

Network pruning is a method for reducing test-time computational resource requirements with minimal performance degradation. Conventional wisdom about pruning algorithms suggests that: (1) pruning methods exploit information from the training data to find good subnetworks; (2) the architecture of the pruned network is crucial for good performance. In this paper, we conduct sanity checks on these beliefs for several recent unstructured pruning methods and surprisingly find that: (1) a family of methods that aim to find good subnetworks of the randomly-initialized network (which we call "initial tickets") hardly exploit any information from the training data; (2) for the pruned networks obtained by these methods, randomly relocating the preserved weights within each layer, while keeping the number of preserved weights in each layer unchanged, does not affect the final performance. These findings inspire us to choose a series of simple, data-independent prune ratios for each layer and randomly prune each layer accordingly to obtain a subnetwork (which we call a "random ticket"). Experimental results show that our zero-shot random tickets outperform or attain performance similar to existing initial tickets. In addition, we identify one existing pruning method that passes our sanity checks. We hybridize the ratios of our random tickets with this method and propose a new method called "hybrid tickets", which achieves further improvement.
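To make the two ideas in the abstract concrete, the following is a minimal sketch assuming a PyTorch model; the function names random_ticket_masks and shuffle_masks_layerwise are illustrative and not taken from the paper's code. It shows (a) drawing a "random ticket" from data-independent per-layer prune ratios and (b) the layerwise-shuffling sanity check, which relocates the preserved weights within each layer while keeping the per-layer count of kept weights fixed.

```python
import torch
import torch.nn as nn

def random_ticket_masks(model, prune_ratios):
    """Sketch of a 'random ticket': for each prunable layer, keep a
    data-independent fraction of weights chosen uniformly at random.
    `prune_ratios` maps layer name -> fraction of weights to remove."""
    masks = {}
    for name, module in model.named_modules():
        if isinstance(module, (nn.Linear, nn.Conv2d)) and name in prune_ratios:
            w = module.weight
            n_keep = int(w.numel() * (1.0 - prune_ratios[name]))
            # Pick n_keep positions uniformly at random, independent of any data.
            perm = torch.randperm(w.numel(), device=w.device)
            mask = torch.zeros(w.numel(), device=w.device)
            mask[perm[:n_keep]] = 1.0
            masks[name] = mask.view_as(w)
    return masks

def shuffle_masks_layerwise(masks):
    """Sketch of the sanity check: randomly relocate the preserved positions
    within each layer; the number of kept weights per layer is unchanged."""
    shuffled = {}
    for name, mask in masks.items():
        flat = mask.flatten()
        shuffled[name] = flat[torch.randperm(flat.numel())].view_as(mask)
    return shuffled
```

In both cases the resulting binary mask would be applied multiplicatively to the layer's weights (e.g. module.weight.data.mul_(masks[name])) before training the subnetwork from scratch, so the comparison isolates the effect of which positions are kept rather than how they were chosen.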


