The Generalization-Stability Tradeoff in Neural Network Pruning

06/09/2019
by Brian R. Bartoldson, et al.

Pruning neural network parameters to reduce model size is an area of much interest, but the original motivation for pruning was the prevention of overfitting rather than the improvement of computational efficiency. This motivation is particularly relevant given the perhaps surprising observation that a wide variety of pruning approaches confer increases in test accuracy, even when parameter counts are drastically reduced. To better understand this phenomenon, we analyze the behavior of pruning over the course of training, finding that pruning's effect on generalization relies more on the instability generated by pruning than on the final size of the pruned model. We demonstrate that even pruning of seemingly unimportant parameters can lead to such instability, allowing our finding to account for the generalization benefits of modern pruning techniques. Our results ultimately suggest that, counter-intuitively, pruning regularizes through instability and mechanisms unrelated to parameter counts.
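The kind of pruning event the abstract analyzes can be sketched as a magnitude-pruning step applied during training: the smallest-magnitude weights are zeroed out, which perturbs the network's function and produces the momentary "instability" the paper attributes generalization benefits to. This is a minimal illustrative sketch, not the paper's implementation; the function name and the NumPy setup are assumptions.

```python
import numpy as np

def magnitude_prune(weights, fraction):
    """Illustrative pruning step: zero out the smallest-magnitude
    `fraction` of entries in `weights`, returning the pruned array
    and the boolean mask of surviving weights."""
    flat = np.abs(weights).ravel()
    k = int(fraction * flat.size)
    if k == 0:
        return weights.copy(), np.ones(weights.shape, dtype=bool)
    # Threshold at the k-th smallest absolute value; weights at or
    # below it are removed.
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask, mask

# Pruning half of a random weight matrix. In the paper's framing,
# the drop in training accuracy immediately after such an event
# (relative to just before it) is one way to measure instability.
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
pruned, mask = magnitude_prune(w, 0.5)
print(int(mask.sum()))  # 8 of 16 weights survive
```

Iterative pruning schedules repeat a step like this throughout training, so the network repeatedly recovers from these perturbations rather than being pruned only once at the end.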


