
Pruning Neural Networks at Initialization: Why are We Missing the Mark?

09/18/2020
by Jonathan Frankle, et al.

Recent work has explored the possibility of pruning neural networks at initialization. We assess proposals for doing so: SNIP (Lee et al., 2019), GraSP (Wang et al., 2020), SynFlow (Tanaka et al., 2020), and magnitude pruning. Although these methods surpass the trivial baseline of random pruning, they remain below the accuracy of magnitude pruning after training, and we endeavor to understand why. We show that, unlike pruning after training, accuracy is the same or higher when randomly shuffling which weights these methods prune within each layer or sampling new initial values. As such, the per-weight pruning decisions made by these methods can be replaced by a per-layer choice of the fraction of weights to prune. This property undermines the claimed justifications for these methods and suggests broader challenges with the underlying pruning heuristics, the desire to prune at initialization, or both.
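
As a rough illustration of the paper's central ablation, the sketch below (not the authors' code; the layer names, shapes, and the magnitude_masks / shuffle_masks helpers are invented for this example) prunes each layer by weight magnitude at initialization and then randomly shuffles each layer's mask. The shuffle preserves the per-layer fraction of pruned weights while discarding the per-weight decisions, which is exactly the substitution the paper finds to be harmless for SNIP, GraSP, SynFlow, and magnitude pruning at initialization.

import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a network's weights at initialization; the layer
# names and shapes here are hypothetical, chosen only for illustration.
weights = {
    "conv1": rng.standard_normal((16, 3, 3, 3)),
    "conv2": rng.standard_normal((32, 16, 3, 3)),
    "fc": rng.standard_normal((10, 512)),
}

def magnitude_masks(weights, sparsity):
    # Per-layer magnitude pruning: in each layer, zero out the
    # `sparsity` fraction of weights with the smallest absolute value.
    masks = {}
    for name, w in weights.items():
        k = int(sparsity * w.size)                  # number of weights to prune
        threshold = np.sort(np.abs(w).ravel())[k]   # k-th smallest magnitude
        masks[name] = (np.abs(w) >= threshold).astype(w.dtype)
    return masks

def shuffle_masks(masks, rng):
    # The paper's ablation: randomly permute each layer's mask. This
    # keeps the per-layer fraction of pruned weights but discards the
    # per-weight choices made by the pruning heuristic.
    shuffled = {}
    for name, m in masks.items():
        flat = m.ravel().copy()
        rng.shuffle(flat)
        shuffled[name] = flat.reshape(m.shape)
    return shuffled

masks = magnitude_masks(weights, sparsity=0.8)
shuffled = shuffle_masks(masks, rng)

for name, w in weights.items():
    print(f"{name}: kept {int(masks[name].sum())}/{w.size} weights; "
          f"shuffled mask keeps {int(shuffled[name].sum())}")

In the paper's experiments, training with the shuffled masks matches or exceeds the accuracy obtained with the original masks for all four methods, which is why a per-layer choice of pruning fraction suffices to replace the per-weight decisions.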


Related Research

07/05/2021
Why is Pruning at Initialization Immune to Reinitializing and Shuffling?
Recent studies assessing the efficacy of pruning neural networks methods...

02/19/2020
Pruning untrained neural networks: Principles and Analysis
Overparameterized neural networks display state-of-the-art performance. ...

03/26/2023
Does 'Deep Learning on a Data Diet' reproduce? Overall yes, but GraNd at Initialization does not
The paper 'Deep Learning on a Data Diet' by Paul et al. (2021) introduce...

07/01/2022
Studying the impact of magnitude pruning on contrastive learning methods
We study the impact of different pruning techniques on the representatio...

06/17/2021
Pruning Randomly Initialized Neural Networks with Iterative Randomization
Pruning the weights of randomly initialized neural networks plays an imp...

06/14/2019
A Signal Propagation Perspective for Pruning Neural Networks at Initialization
Network pruning is a promising avenue for compressing deep neural networ...

07/05/2021
Connectivity Matters: Neural Network Pruning Through the Lens of Effective Sparsity
Neural network pruning is a fruitful area of research with surging inter...