Revisiting hard thresholding for DNN pruning

05/21/2019
by Konstantinos Pitas, et al.

The most common method for DNN pruning is hard thresholding of network weights, followed by retraining to recover any lost accuracy. Recently developed smart pruning algorithms instead use the DNN's response over the training set, under a variety of cost functions, to identify redundant network weights, which can reduce accuracy degradation and possibly retraining time. In experiments measuring total pruning time (pruning time plus retraining time), we show that hard thresholding followed by retraining remains the most efficient way of reducing the number of network parameters. However, smart pruning algorithms still have advantages when retraining is not possible. In this setting we propose a novel smart pruning algorithm based on difference-of-convex-functions optimisation and show that it is often orders of magnitude faster than competing approaches while achieving the lowest classification accuracy degradation. Furthermore, we theoretically investigate the effect of hard thresholding on DNN accuracy. We show that accuracy degradation increases with the remaining network depth after the pruned layer. We also identify a link between the latent dimensionality of the training data manifold and a network's robustness to hard thresholding.
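For context, hard thresholding is plain magnitude pruning: weights whose absolute value falls below a chosen threshold are set to zero, and the network is then retrained to recover accuracy. The following minimal NumPy sketch illustrates only the thresholding step; the function name and the sparsity parameter are illustrative and not taken from the paper.

    import numpy as np

    def hard_threshold(weights: np.ndarray, sparsity: float) -> np.ndarray:
        """Zero out the smallest-magnitude entries so that roughly `sparsity`
        of the weights are pruned (hard thresholding / magnitude pruning)."""
        k = int(sparsity * weights.size)
        if k == 0:
            return weights.copy()
        # The threshold tau is the k-th smallest absolute value; entries at or
        # below tau are removed.
        tau = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
        return weights * (np.abs(weights) > tau)

    # Example: prune 90% of a random layer's weights.
    W = np.random.randn(256, 512)
    W_pruned = hard_threshold(W, sparsity=0.9)
    print(f"remaining nonzero fraction: {np.count_nonzero(W_pruned) / W.size:.2f}")

In practice the surviving weights are then fine-tuned with the zero pattern held fixed, which is the retraining step the abstract refers to.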


