DropNet: Reducing Neural Network Complexity via Iterative Pruning

07/14/2022
by John Tan Chong Min, et al.

Modern deep neural networks require a significant amount of computing time and power to train and deploy, which limits their use on edge devices. Inspired by the iterative weight pruning of the Lottery Ticket Hypothesis, we propose DropNet, an iterative pruning method that prunes nodes/filters to reduce network complexity. DropNet iteratively removes the nodes/filters with the lowest average post-activation value across all training samples. Empirically, we show that DropNet is robust across diverse scenarios, including MLPs and CNNs, using the MNIST, CIFAR-10 and Tiny ImageNet datasets. We show that up to 90% of the nodes/filters can be removed without any significant loss of accuracy. The final pruned network performs well even with reinitialization of the weights and biases. DropNet also achieves similar accuracy to an oracle which greedily removes nodes/filters one at a time to minimise training loss, highlighting its effectiveness.
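The pruning criterion described in the abstract is simple enough to sketch directly. Below is a minimal PyTorch sketch of one pruning round under our reading of it: a forward hook accumulates the average post-activation magnitude per node/filter over the training set, and the lowest-scoring fraction of the still-active units is masked out. The helper names (avg_post_activation, dropnet_mask) and the per-round fraction frac are illustrative assumptions, not the authors' reference code.

```python
import torch


@torch.no_grad()
def avg_post_activation(model, layer, loader, device="cpu"):
    """Score each node/filter of `layer` by its mean |post-activation|
    over every sample yielded by `loader` (sketch of DropNet's metric)."""
    captured = {}
    handle = layer.register_forward_hook(
        lambda _module, _inputs, output: captured.update(out=output.detach())
    )
    model.eval()
    total, n_samples = 0.0, 0
    for x, _ in loader:
        model(x.to(device))
        a = captured["out"].abs()
        # Average over every axis except the node/filter axis (dim 1):
        # conv output (B, C, H, W) -> one score per filter,
        # linear output (B, F)     -> one score per node.
        dims = [d for d in range(a.dim()) if d != 1]
        total = total + a.mean(dim=dims) * x.size(0)
        n_samples += x.size(0)
    handle.remove()
    return total / n_samples


def dropnet_mask(scores, frac, prev_mask=None):
    """Return a 0/1 mask that drops the `frac` lowest-scoring
    nodes/filters among those still active in `prev_mask`."""
    mask = torch.ones_like(scores) if prev_mask is None else prev_mask.clone()
    active = mask.nonzero().squeeze(1)
    k = int(frac * active.numel())
    drop = active[scores[active].argsort()[:k]]
    mask[drop] = 0.0
    return mask
```

Each pruning round would then train the network, score every prunable layer, apply the updated masks (for example, by zeroing the masked output channels), and repeat; per the abstract, the surviving subnetwork can also be reinitialized before retraining without significant loss of accuracy.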
