Lost in Pruning: The Effects of Pruning Neural Networks beyond Test Accuracy

03/04/2021
by Lucas Liebenwein, et al.

Neural network pruning is a popular technique used to reduce the inference costs of modern, potentially overparameterized, networks. Starting from a pre-trained network, the process is as follows: remove redundant parameters, retrain, and repeat while maintaining the same test accuracy. The result is a model that is a fraction of the size of the original with comparable predictive performance (test accuracy). Here, we reassess and evaluate whether the use of test accuracy alone in the terminating condition is sufficient to ensure that the resulting model performs well across a wide spectrum of "harder" metrics, such as generalization to out-of-distribution data and resilience to noise. Across evaluations on varying architectures and data sets, we find that pruned networks effectively approximate the unpruned model; however, the prune ratio at which pruned networks achieve commensurate performance varies significantly across tasks. These results call into question the extent of genuine overparameterization in deep learning and raise concerns about the practicality of deploying pruned networks, specifically in the context of safety-critical systems, unless they are evaluated beyond test accuracy so that their performance can be reliably predicted. Our code is available at https://github.com/lucaslie/torchprune.
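For concreteness, here is a minimal PyTorch sketch of the prune-retrain loop the abstract describes, together with one of the "harder" metrics it mentions (accuracy under Gaussian input noise). The helper names (evaluate, evaluate_under_noise, retrain, iterative_prune), the per-round prune fraction, the accuracy tolerance, and the noise level are all illustrative assumptions, not the paper's exact setup; the actual experiments use the torchprune code linked above.

import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

def evaluate(model, loader, device="cpu"):
    # Plain top-1 test accuracy.
    model.eval()
    correct = total = 0
    with torch.no_grad():
        for x, y in loader:
            pred = model(x.to(device)).argmax(dim=1)
            correct += (pred == y.to(device)).sum().item()
            total += y.numel()
    return correct / total

def evaluate_under_noise(model, loader, sigma=0.1, device="cpu"):
    # One "harder" metric: accuracy when Gaussian noise is added to the
    # inputs (sigma=0.1 is an illustrative choice, not the paper's).
    model.eval()
    correct = total = 0
    with torch.no_grad():
        for x, y in loader:
            x = x + sigma * torch.randn_like(x)
            pred = model(x.to(device)).argmax(dim=1)
            correct += (pred == y.to(device)).sum().item()
            total += y.numel()
    return correct / total

def retrain(model, loader, epochs=1, lr=1e-3, device="cpu"):
    # Fine-tune the surviving weights after each pruning step.
    opt = torch.optim.SGD(model.parameters(), lr=lr, momentum=0.9)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x.to(device)), y.to(device)).backward()
            opt.step()

def iterative_prune(model, train_loader, test_loader,
                    amount=0.2, tolerance=0.005):
    # Prune-retrain-repeat: each round removes `amount` of the remaining
    # weights by global L1 magnitude, then retrains; stop once test
    # accuracy drops more than `tolerance` below the unpruned baseline.
    baseline = evaluate(model, test_loader)
    params = [(m, "weight") for m in model.modules()
              if isinstance(m, (nn.Linear, nn.Conv2d))]
    while True:
        prune.global_unstructured(
            params, pruning_method=prune.L1Unstructured, amount=amount)
        retrain(model, train_loader)
        if evaluate(model, test_loader) < baseline - tolerance:
            break  # a real pipeline would roll back to the previous round
    return model

Note how the stopping rule looks only at test accuracy; the paper's point is that, at a given prune ratio, a model passing this check may still degrade on metrics like evaluate_under_noise, so those should be tracked alongside accuracy at every round.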

02/24/2020

On Pruning Adversarially Robust Neural Networks

In safety-critical but computationally resource-constrained applications...
03/19/2021

Toward Compact Deep Neural Networks via Energy-Aware Pruning

Despite the remarkable performance, modern deep neural networks are i...
06/22/2021

Randomness In Neural Network Training: Characterizing The Impact of Tooling

The quest for determinism in machine learning has disproportionately foc...
06/25/2019

Importance Estimation for Neural Network Pruning

Structural pruning of neural network parameters reduces computation, ene...
06/15/2022

Can pruning improve certified robustness of neural networks?

With the rapid development of deep learning, the sizes of neural network...
11/11/2021

AlphaGarden: Learning to Autonomously Tend a Polyculture Garden

This paper presents AlphaGarden: an autonomous polyculture garden that p...
07/26/2019

Multi-Stage Prediction Networks for Data Harmonization

In this paper, we introduce multi-task learning (MTL) to data harmonizat...