Continuous Dropout

11/28/2019
by Xu Shen, et al.

Dropout has been proven to be an effective algorithm for training robust deep networks because of its ability to prevent overfitting by avoiding the co-adaptation of feature detectors. Current explanations of dropout include bagging, naive Bayes, regularization, and the role of sex in evolution. Observations of neural activity in the human brain show that, when faced with different situations, the firing rates of neurons are random and continuous rather than binary, as assumed by current dropout. Inspired by this phenomenon, we extend traditional binary dropout to continuous dropout. On the one hand, continuous dropout is considerably closer to the activation characteristics of neurons in the human brain than traditional binary dropout. On the other hand, we demonstrate that continuous dropout retains the property of avoiding the co-adaptation of feature detectors, which suggests that we can extract more independent feature detectors for model averaging in the test stage. We introduce the proposed continuous dropout to a feedforward neural network and comprehensively compare it with binary dropout, adaptive dropout, and DropConnect on MNIST, CIFAR-10, SVHN, NORB, and ILSVRC-12. Thorough experiments demonstrate that our method performs better in preventing the co-adaptation of feature detectors and improves test performance. The code is available at: https://github.com/jasonustc/caffe-multigpu/tree/dropout.
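The abstract does not specify the distribution of the continuous mask; the authors' reference implementation is the Caffe code linked above. As a rough illustration only, the NumPy sketch below replaces the Bernoulli keep/drop mask of binary dropout with a per-unit mask drawn from a uniform distribution on [0, 1], rescaled by its expectation so the expected activation is unchanged. The function name `continuous_dropout`, the uniform choice, and the rescaling are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def continuous_dropout(x, low=0.0, high=1.0, training=True, rng=None):
    """Scale each unit by a continuous random mask instead of a binary one.

    Hypothetical sketch of the idea in the abstract: the mask distribution
    (uniform on [low, high]) and the rescaling by its mean are assumptions.
    """
    if not training:
        # At test time, activations are passed through unchanged.
        return x
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.uniform(low, high, size=x.shape)   # continuous mask per unit
    expected = (low + high) / 2.0                 # E[mask]
    # Rescale so E[output] == x, analogous to "inverted" binary dropout.
    return x * mask / expected
```

For example, `continuous_dropout(np.ones((2, 3)))` multiplies each activation by an independent draw from U(0, 1) and divides by 0.5, so the output matches the input in expectation while the co-adaptation between units is perturbed by the random scaling.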


