NodeDrop: A Condition for Reducing Network Size without Effect on Output

06/03/2019
by Louis Jensen, et al.

Determining an appropriate number of features for each layer in a neural network is an important and difficult task. It is especially important in applications on systems with limited memory or processing power. Many current approaches to reducing network size either rely on iterative procedures, which can extend training time significantly, or require very careful tuning of algorithm parameters to achieve reasonable results. In this paper we propose NodeDrop, a new method for eliminating features in a network. With NodeDrop, we define a condition identifying nodes that are guaranteed to carry no information, and then use regularization to encourage nodes to meet this condition. We find that NodeDrop drastically reduces the number of features in a network while maintaining high performance, reducing the number of parameters by a factor of 114x for a VGG-like network on CIFAR10 without a drop in accuracy.
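The core idea in the abstract can be sketched concretely. This is a minimal illustration, not the paper's implementation: it assumes ReLU activations and inputs bounded in [0, 1], under which a node whose maximum possible pre-activation is non-positive always outputs zero and can be removed without changing the network's output. The function names and the exact regularizer below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def dead_node_mask(W, b, input_max=1.0):
    """Identify provably dead ReLU nodes (illustrative sketch).

    For a node with pre-activation w.x + b and inputs x in [0, input_max],
    the largest achievable pre-activation is (sum of positive weights)
    * input_max + b.  If that bound is <= 0, the node always outputs zero.
    W: (n_out, n_in) weight matrix, b: (n_out,) bias vector.
    """
    max_preact = np.clip(W, 0, None).sum(axis=1) * input_max + b
    return max_preact <= 0.0  # True = node carries no information

def nodedrop_penalty(W, b, input_max=1.0):
    """Illustrative regularizer (an assumption, not the paper's exact term):
    penalize each node's margin above the drop condition, encouraging
    uninformative nodes to satisfy it during training."""
    margin = np.clip(W, 0, None).sum(axis=1) * input_max + b
    return np.clip(margin, 0, None).sum()
```

Nodes flagged by `dead_node_mask` can be deleted, along with their outgoing weights in the next layer, with no effect on the network's output, which is the guarantee the abstract refers to.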


Related research

Progressive Learning for Systematic Design of Large Neural Networks (10/23/2017)
We develop an algorithm for systematic design of a large artificial neur...

Exploring Sparsity in Recurrent Neural Networks (04/17/2017)
Recurrent Neural Networks (RNN) are widely used to solve a variety of pr...

Reducing Memory Requirements for the IPU using Butterfly Factorizations (09/16/2023)
High Performance Computing (HPC) benefits from different improvements du...

Multilinear Map Layer: Prediction Regularization by Structural Constraint (07/30/2015)
In this paper we propose and study a technique to impose structural cons...

TinyTrain: Deep Neural Network Training at the Extreme Edge (07/19/2023)
On-device training is essential for user personalisation and privacy. Wi...

Less is More! A slim architecture for optimal language translation (05/18/2023)
The softmax attention mechanism has emerged as a noteworthy development ...
