Neuron-Specific Dropout: A Deterministic Regularization Technique to Prevent Neural Networks from Overfitting and Reduce Dependence on Large Training Samples

01/13/2022
by Joshua Shunk, et al.

In order to develop complex relationships between their inputs and outputs, deep neural networks train and adjust a large number of parameters. Making these networks work at high accuracy requires vast amounts of data; sometimes, however, the quantity of data needed is not present or obtainable for training. Neuron-specific dropout (NSDropout) is a tool to address this problem. NSDropout examines both the training pass and the validation pass of a layer in a model. By comparing the average values produced by each neuron for each class in a data set, the network is able to drop targeted units. The layer is able to predict what features, or noise, the model is looking at during training that are not present when looking at samples from validation. Unlike dropout, the "thinned" networks cannot be "unthinned" for testing. Neuron-specific dropout has been shown to achieve similar, if not better, testing accuracy with far less data than traditional methods, including dropout and other regularization techniques. Experimentation has shown that neuron-specific dropout reduces the chance of a network overfitting and reduces the need for large training samples on supervised learning tasks in image recognition, all while producing best-in-class results.
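The abstract describes the mechanism but not the full algorithm, so the following NumPy snippet is only a minimal sketch under stated assumptions: per-class mean activations from a training pass and a validation pass are compared, and the units with the largest train/validation discrepancy are deterministically masked. The function name ns_dropout_mask, the drop_fraction parameter, and the absolute-difference divergence score are illustrative choices, not details taken from the paper.

```python
import numpy as np

def ns_dropout_mask(train_acts, train_labels, val_acts, val_labels,
                    num_classes, drop_fraction=0.1):
    """Hypothetical sketch of the NSDropout comparison step.

    For each class, compare the mean activation of every unit on the
    training pass against the same statistic on the validation pass,
    then deterministically mask the units whose behavior diverges the
    most (treated here as features or noise picked up from training
    data that do not appear in validation data).
    """
    n_units = train_acts.shape[1]
    divergence = np.zeros(n_units)
    for c in range(num_classes):
        # Per-class mean activation of each unit on both passes.
        train_mean = train_acts[train_labels == c].mean(axis=0)
        val_mean = val_acts[val_labels == c].mean(axis=0)
        divergence += np.abs(train_mean - val_mean)
    # Zero out the drop_fraction of units with the largest
    # train/validation discrepancy; all other units are kept.
    n_drop = int(drop_fraction * n_units)
    drop_idx = np.argsort(divergence)[-n_drop:]
    mask = np.ones(n_units)
    mask[drop_idx] = 0.0
    return mask

# Illustrative usage on random activations (shapes are made up):
rng = np.random.default_rng(0)
train_acts = rng.normal(size=(512, 128))   # hidden-layer outputs, training pass
train_labels = rng.integers(0, 10, size=512)
val_acts = rng.normal(size=(128, 128))     # hidden-layer outputs, validation pass
val_labels = rng.integers(0, 10, size=128)

mask = ns_dropout_mask(train_acts, train_labels, val_acts, val_labels,
                       num_classes=10)
thinned = train_acts * mask                # targeted, deterministic "thinning"
```

Unlike the random Bernoulli masks of standard dropout, the mask here is deterministic and class-informed, which is consistent with the abstract's remark that the thinned network cannot simply be "unthinned" for testing.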

