Methods for Pruning Deep Neural Networks

10/31/2020
by   Sunil Vadera, et al.

This paper presents a survey of methods for pruning deep neural networks, from algorithms first proposed for fully connected networks in the 1990s to recent methods developed for reducing the size of convolutional neural networks. The paper begins by bringing together many different algorithms, categorising them by the underlying approach used. It then focuses on three categories: methods that use magnitude-based pruning, methods that utilise clustering to identify redundancy, and methods that utilise sensitivity analysis. Some of the key influential studies within these categories are presented to illuminate the underlying approaches and the results achieved. Most studies on pruning report results from empirical evaluations, which are scattered across the literature as new architectures, algorithms and data sets have evolved over time. This paper collects the reported results from key papers in one place, providing a resource that can be used to quickly compare reported results and to trace studies where specific methods, data sets and architectures have been used.
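Of the three categories the survey highlights, magnitude-based pruning is the simplest to illustrate: weights with the smallest absolute values are assumed to contribute least and are removed. The following is a minimal sketch of that idea using NumPy; the function name and the example matrix are illustrative, not taken from the paper.

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction of weights.

    A hypothetical sketch of magnitude-based pruning: weights whose
    absolute value falls below the given percentile threshold are set
    to zero, and a binary mask records which weights survive.
    """
    threshold = np.percentile(np.abs(weights).ravel(), sparsity * 100)
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

# Example: prune roughly half of a small weight matrix.
w = np.array([[0.8, -0.05],
              [0.01, -0.9]])
pruned, mask = magnitude_prune(w, 0.5)
# The two small entries (0.01 and -0.05) are zeroed;
# 0.8 and -0.9 are retained.
```

In practice such pruning is usually applied layer-wise or globally across a trained network and followed by fine-tuning to recover accuracy, which is where the methods surveyed in the paper differ.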


Related research

- 03/06/2020: What is the State of Neural Network Pruning?
  Neural network pruning—the task of reducing the size of a network by rem...

- 05/20/2021: A Probabilistic Approach to Neural Network Pruning
  Neural network pruning techniques reduce the number of parameters withou...

- 10/28/2021: An Operator Theoretic Perspective on Pruning Deep Neural Networks
  The discovery of sparse subnetworks that are able to perform as well as ...

- 10/25/2022: Learning Ability of Interpolating Convolutional Neural Networks
  It is frequently observed that overparameterized neural networks general...

- 10/24/2019: A Comparative Study of Neural Network Compression
  There has recently been an increasing desire to evaluate neural networks...

- 06/15/2018: Detecting Dead Weights and Units in Neural Networks
  Deep Neural Networks are highly over-parameterized and the size of the n...

- 07/03/2003: Generation of Explicit Knowledge from Empirical Data through Pruning of Trainable Neural Networks
  This paper presents a generalized technology of extraction of explicit k...
