Lost in Pruning: The Effects of Pruning Neural Networks beyond Test Accuracy

03/04/2021
by Lucas Liebenwein, et al.

Neural network pruning is a popular technique used to reduce the inference costs of modern, potentially overparameterized, networks. Starting from a pre-trained network, the process is as follows: remove redundant parameters, retrain, and repeat while maintaining the same test accuracy. The result is a model that is a fraction of the size of the original with comparable predictive performance (test accuracy). Here, we reassess and evaluate whether the use of test accuracy alone in the terminating condition is sufficient to ensure that the resulting model performs well across a wide spectrum of "harder" metrics, such as generalization to out-of-distribution data and resilience to noise. Across evaluations on varying architectures and data sets, we find that pruned networks effectively approximate the unpruned model; however, the prune ratio at which pruned networks achieve commensurate performance varies significantly across tasks. These results call into question the extent of genuine overparameterization in deep learning and raise concerns about the practicality of deploying pruned networks, specifically in the context of safety-critical systems, unless they are widely evaluated beyond test accuracy to reliably predict their performance. Our code is available at https://github.com/lucaslie/torchprune.
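For concreteness, the prune-retrain loop described in the abstract might look like the following sketch in plain PyTorch. This is a minimal illustration using torch.nn.utils.prune for global magnitude pruning, not the authors' torchprune implementation; train_one_epoch and evaluate are assumed helper functions, and the accuracy budget is a hypothetical parameter.

```python
# Minimal sketch of the iterative prune-retrain loop, assuming standard
# PyTorch. NOT the torchprune API: train_one_epoch() and evaluate() are
# placeholder helpers the reader is assumed to provide.
import torch.nn as nn
import torch.nn.utils.prune as prune

def iterative_prune(model, train_loader, test_loader,
                    step=0.2, retrain_epochs=5, accuracy_budget=0.005):
    """Prune a fraction of weights per iteration, retrain, and repeat
    while test accuracy stays near the unpruned baseline."""
    baseline = evaluate(model, test_loader)  # assumed helper
    params = [(m, "weight") for m in model.modules()
              if isinstance(m, (nn.Conv2d, nn.Linear))]
    while True:
        # Remove the smallest-magnitude weights globally across layers;
        # repeated calls compound the pruning masks.
        prune.global_unstructured(params,
                                  pruning_method=prune.L1Unstructured,
                                  amount=step)
        for _ in range(retrain_epochs):
            train_one_epoch(model, train_loader)  # assumed helper
        # Terminating condition based on test accuracy alone, as the
        # abstract describes. (A full implementation would roll back the
        # final over-pruned step; omitted here for brevity.)
        if evaluate(model, test_loader) < baseline - accuracy_budget:
            break
    # Bake the masks into the weight tensors.
    for module, name in params:
        prune.remove(module, name)
    return model
```

The point of the paper is precisely that this stopping rule, test accuracy against the unpruned baseline, may not guarantee commensurate performance on harder metrics such as out-of-distribution generalization or noise resilience.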



Related research

02/24/2020 · On Pruning Adversarially Robust Neural Networks
In safety-critical but computationally resource-constrained applications...

03/19/2021 · Toward Compact Deep Neural Networks via Energy-Aware Pruning
Despite the remarkable performance, modern deep neural networks are i...

06/22/2021 · Randomness In Neural Network Training: Characterizing The Impact of Tooling
The quest for determinism in machine learning has disproportionately foc...

06/25/2019 · Importance Estimation for Neural Network Pruning
Structural pruning of neural network parameters reduces computation, ene...

06/09/2023 · How Sparse Can We Prune A Deep Network: A Geometric Viewpoint
Overparameterization constitutes one of the most significant hallmarks o...

08/30/2022 · DLDNN: Deterministic Lateral Displacement Design Automation by Neural Networks
Size-based separation of bioparticles/cells is crucial to a variety of b...

07/26/2019 · Multi-Stage Prediction Networks for Data Harmonization
In this paper, we introduce multi-task learning (MTL) to data harmonizat...
