Pruning neural networks: is it time to nip it in the bud?

10/10/2018
by Elliot J. Crowley, et al.

Pruning is a popular technique for compressing a neural network: a large pre-trained network is fine-tuned while connections are successively removed. However, the value of pruning has largely evaded scrutiny. In this extended abstract, we examine residual networks obtained through Fisher-pruning and make two interesting observations. First, when time-constrained, it is better to train a simple, smaller network from scratch than to prune a large network. Second, it is the architectures obtained through the pruning process, not the learnt weights, that prove valuable: such architectures are powerful when trained from scratch. Furthermore, these architectures are easy to approximate without any further pruning: we can prune once and obtain a family of new, scalable network architectures for different memory requirements.
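To make the Fisher-pruning criterion concrete, the sketch below scores each channel of a convolutional layer by the squared sum of activation-gradient products over spatial positions, averaged over the batch, so the lowest-scoring channel can be removed. This is a minimal PyTorch illustration under stated assumptions: the `FisherProbe` class, its name, and the toy loop are inventions for this example, not the authors' implementation.

```python
import torch
import torch.nn as nn


class FisherProbe:
    """Accumulates a per-channel Fisher saliency for one conv layer.

    Hypothetical helper for illustration; the saliency for channel c is
    0.5 * mean over the batch of (sum over space of activation * gradient)^2.
    """

    def __init__(self, module: nn.Conv2d):
        self.saliency = None  # running sum over mini-batches, shape (C,)
        module.register_forward_hook(self._forward_hook)

    def _forward_hook(self, module, inputs, output):
        # Hook the output tensor so we observe its gradient during backward.
        acts = output.detach()
        output.register_hook(lambda grad: self._accumulate(acts, grad))

    def _accumulate(self, acts, grads):
        # (N, C, H, W) -> (N, C): sum activation * gradient over space,
        # then average the squared sums over the batch.
        prod = (acts * grads).sum(dim=(2, 3))
        s = 0.5 * prod.pow(2).mean(dim=0)
        self.saliency = s if self.saliency is None else self.saliency + s


if __name__ == "__main__":
    conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)
    probe = FisherProbe(conv)
    for _ in range(4):  # stand-in for real fine-tuning batches
        x = torch.randn(16, 3, 32, 32)
        loss = conv(x).pow(2).mean()  # stand-in loss
        loss.backward()
    weakest = int(torch.argmin(probe.saliency))
    print(f"channel to prune next: {weakest}")
```

In line with the abstract's second observation, the per-layer channel counts left behind by one such pruning run can themselves be copied and uniformly scaled to produce a family of architectures trained from scratch, with no further pruning.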


