Network Trimming: A Data-Driven Neuron Pruning Approach towards Efficient Deep Architectures

07/12/2016
by Hengyuan Hu et al.

State-of-the-art neural networks are getting deeper and wider. While their performance increases with the number of layers and neurons, designing an efficient deep architecture is crucial for reducing computational and memory costs. Designing an efficient neural network, however, is labor-intensive, requiring many experiments and fine-tuning. In this paper, we introduce network trimming, which iteratively optimizes a network by pruning unimportant neurons based on an analysis of their outputs on a large dataset. Our algorithm is inspired by the observation that the outputs of a significant portion of neurons in a large network are mostly zero, regardless of what inputs the network receives. These zero-activation neurons are redundant and can be removed without affecting the overall accuracy of the network. After pruning the zero-activation neurons, we retrain the network using the pre-pruning weights as initialization. We alternate pruning and retraining to further reduce the number of zero activations in the network. Our experiments on LeNet and VGG-16 show that we can achieve a high compression ratio of parameters without losing accuracy, or even achieve higher accuracy than the original network.
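The core pruning criterion described above can be sketched as follows: collect the post-ReLU activations of a layer over a large dataset, measure the fraction of inputs for which each neuron outputs zero, and mark neurons that are almost always zero as candidates for removal. This is a minimal NumPy sketch, not the paper's implementation; the `threshold` value and function names here are illustrative assumptions.

```python
import numpy as np

def zero_activation_fraction(activations):
    """Fraction of zero (post-ReLU) outputs per neuron.

    activations: array of shape (num_samples, num_neurons),
    collected by running the network over a large dataset.
    """
    return np.mean(activations == 0, axis=0)

def neurons_to_prune(activations, threshold=0.9):
    """Indices of neurons whose outputs are zero on at least
    `threshold` of the samples; candidates for removal.
    The 0.9 default is an illustrative choice, not the paper's."""
    frac = zero_activation_fraction(activations)
    return np.where(frac >= threshold)[0]

# Toy example: 4 samples, 3 neurons; neuron 2 is always zero.
acts = np.array([[0.5, 0.0, 0.0],
                 [1.2, 0.3, 0.0],
                 [0.0, 0.7, 0.0],
                 [0.9, 0.0, 0.0]])
print(neurons_to_prune(acts))  # -> [2]
```

In the iterative scheme, one would remove the selected neurons (i.e., drop the corresponding rows/columns of the adjacent weight matrices), retrain from the surviving pre-pruning weights, and repeat the measurement.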

