Pruning's Effect on Generalization Through the Lens of Training and Regularization

10/25/2022
by Tian Jin et al.

Practitioners frequently observe that pruning improves model generalization. A long-standing hypothesis based on the bias-variance trade-off attributes this improvement to model size reduction. However, recent studies of over-parameterization characterize a new model size regime in which larger models achieve better generalization. Pruning models in this over-parameterized regime leads to a contradiction: while theory predicts that reducing model size harms generalization, pruning to a range of sparsities nonetheless improves it. Motivated by this contradiction, we re-examine pruning's effect on generalization empirically. We show that size reduction cannot fully account for the generalization-improving effect of standard pruning algorithms. Instead, we find that pruning leads to better training at certain sparsities, improving the training loss over the dense model, and to additional regularization at other sparsities, reducing the accuracy degradation due to noisy examples relative to the dense model. Pruning extends model training time and reduces model size; these two factors improve training and add regularization, respectively. We empirically demonstrate that both factors are essential to fully explaining pruning's impact on generalization.
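As a concrete illustration of the kind of "standard pruning algorithm" the abstract refers to, below is a minimal sketch of iterative magnitude pruning in PyTorch. This is an assumption about the setup, not the paper's exact method: the helper `magnitude_prune` and the commented training loop are hypothetical stand-ins, and the per-round pruning fraction is arbitrary.

```python
# Minimal sketch of one round of iterative magnitude pruning (IMP).
# Assumes PyTorch; uses the real torch.nn.utils.prune API.
import torch
import torch.nn.utils.prune as prune

def magnitude_prune(model: torch.nn.Module, fraction: float) -> None:
    """Zero out the smallest-magnitude weights in each Linear/Conv2d layer.

    `fraction` is the share of remaining weights pruned this round.
    """
    for module in model.modules():
        if isinstance(module, (torch.nn.Linear, torch.nn.Conv2d)):
            prune.l1_unstructured(module, name="weight", amount=fraction)

# Typical train -> prune -> retrain loop (hypothetical helpers). The
# retraining after each round is the "extended training time" the abstract
# identifies as one of the two factors behind pruning's generalization gains.
#
# for _ in range(num_rounds):
#     magnitude_prune(model, 0.2)       # prune 20% of remaining weights
#     for _ in range(retrain_epochs):   # assumed retraining budget
#         train_one_epoch(model, train_loader)
```

Repeated rounds compound: pruning 20% of the remaining weights per round reaches roughly 49% sparsity after three rounds, which is how such pipelines sweep the "range of sparsities" the abstract studies.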


