Pruning Neural Networks at Initialization: Why are We Missing the Mark?

09/18/2020
by Jonathan Frankle, et al.

Recent work has explored the possibility of pruning neural networks at initialization. We assess proposals for doing so: SNIP (Lee et al., 2019), GraSP (Wang et al., 2020), SynFlow (Tanaka et al., 2020), and magnitude pruning. Although these methods surpass the trivial baseline of random pruning, they remain below the accuracy of magnitude pruning after training, and we endeavor to understand why. We show that, unlike with pruning after training, accuracy is the same or higher when we randomly shuffle which weights these methods prune within each layer or sample new initial values for the weights. As such, the per-weight pruning decisions made by these methods can be replaced by a per-layer choice of the fraction of weights to prune. This property undermines the claimed justifications for these methods and suggests broader challenges with the underlying pruning heuristics, the desire to prune at initialization, or both.
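To make the key ablation concrete, here is a minimal NumPy sketch; it is not the authors' code, and the layer names, shapes, and 50% sparsity level are hypothetical. magnitude_masks implements layerwise magnitude pruning at initialization, and shuffle_masks_within_layers implements the shuffling control described in the abstract: each layer keeps its pruning fraction, but which weights survive is randomized.

    import numpy as np

    def magnitude_masks(weights, fraction):
        """Layerwise magnitude pruning at initialization: in each layer, zero
        out exactly the given fraction of weights with the smallest |w|."""
        masks = {}
        for name, w in weights.items():
            k = int(fraction * w.size)                # weights to prune in this layer
            order = np.argsort(np.abs(w).ravel())     # indices sorted by magnitude
            mask = np.ones(w.size, dtype=np.float32)
            mask[order[:k]] = 0.0                     # drop the k smallest-magnitude weights
            masks[name] = mask.reshape(w.shape)
        return masks

    def shuffle_masks_within_layers(masks, seed=0):
        """The shuffling ablation: keep each layer's pruning fraction but
        randomize which weights survive, discarding per-weight choices."""
        rng = np.random.default_rng(seed)
        shuffled = {}
        for name, mask in masks.items():
            flat = mask.ravel().copy()
            rng.shuffle(flat)                         # same number of zeros, new positions
            shuffled[name] = flat.reshape(mask.shape)
        return shuffled

    # Hypothetical two-layer network at initialization, pruned to 50% per layer.
    rng = np.random.default_rng(1)
    weights = {"conv1": rng.standard_normal((3, 3, 16)),
               "fc": rng.standard_normal((16, 10))}
    masks = magnitude_masks(weights, fraction=0.5)
    shuffled = shuffle_masks_within_layers(masks)
    for name in masks:
        assert masks[name].sum() == shuffled[name].sum()  # per-layer sparsity preserved

Because the paper finds that training with shuffled masks matches or exceeds training with the original masks, the useful signal these methods carry reduces to the per-layer pruning fractions.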


Related research

07/05/2021 · Why is Pruning at Initialization Immune to Reinitializing and Shuffling?
Recent studies assessing the efficacy of neural network pruning methods...

02/19/2020 · Pruning untrained neural networks: Principles and Analysis
Overparameterized neural networks display state-of-the-art performance...

03/26/2023 · Does 'Deep Learning on a Data Diet' reproduce? Overall yes, but GraNd at Initialization does not
The paper 'Deep Learning on a Data Diet' by Paul et al. (2021) introduce...

07/01/2022 · Studying the impact of magnitude pruning on contrastive learning methods
We study the impact of different pruning techniques on the representatio...

06/17/2021 · Pruning Randomly Initialized Neural Networks with Iterative Randomization
Pruning the weights of randomly initialized neural networks plays an imp...

07/05/2021 · Connectivity Matters: Neural Network Pruning Through the Lens of Effective Sparsity
Neural network pruning is a fruitful area of research with surging inter...

09/22/2020 · Sanity-Checking Pruning Methods: Random Tickets can Win the Jackpot
Network pruning is a method for reducing test-time computational resourc...
