Guided Dropout

12/10/2018
by Rohit Keshari, et al.

Dropout is often used in deep neural networks to prevent over-fitting. Conventional dropout training randomly drops nodes from the hidden layers of a neural network. Our hypothesis is that a guided selection of nodes, i.e., intelligent dropout, can lead to better generalization than traditional random dropout. In this research, we propose "guided dropout" for training deep neural networks, which selects the nodes to drop by measuring the strength of each node. We also demonstrate that conventional dropout is a special case of the proposed guided dropout. Experimental evaluations on multiple datasets, including MNIST, CIFAR10, CIFAR100, SVHN, and Tiny ImageNet, demonstrate the efficacy of the proposed guided dropout.
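The abstract does not define the strength measure, so the following is only a minimal sketch of the idea: rank hidden nodes by an assumed strength measure (here, mean absolute activation over the batch) and draw the dropped nodes from the stronger half, rather than uniformly as in conventional dropout. The function name `guided_dropout`, the strength measure, and the "stronger half" pool are all illustrative assumptions, not the paper's actual formulation; with the pool widened to all nodes, the selection becomes uniform and reduces to conventional dropout.

```python
import numpy as np

def guided_dropout(activations, drop_rate=0.5, rng=None):
    """Illustrative strength-guided dropout (not the paper's exact method).

    activations: (batch, n_nodes) array of hidden-layer outputs.
    Nodes are ranked by mean absolute activation (assumed strength
    measure); the dropped set is sampled only from the stronger half.
    """
    rng = np.random.default_rng() if rng is None else rng
    _, n_nodes = activations.shape
    strength = np.abs(activations).mean(axis=0)   # per-node strength (assumption)
    order = np.argsort(strength)[::-1]            # strongest first
    n_drop = int(drop_rate * n_nodes)
    pool = order[: max(n_drop, n_nodes // 2)]     # guided candidate pool
    dropped = rng.choice(pool, size=n_drop, replace=False)
    mask = np.ones(n_nodes)
    mask[dropped] = 0.0
    # inverted-dropout scaling keeps the expected activation unchanged
    return activations * mask / (1.0 - drop_rate)
```

Setting the candidate pool to all `n_nodes` recovers conventional dropout, which is one way to see the abstract's claim that random dropout is a special case of the guided variant.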


Related Research

Survey of Dropout Methods for Deep Neural Networks (04/25/2019)
Dropout methods are a family of stochastic techniques used in neural net...

Dropout Rademacher Complexity of Deep Neural Networks (02/16/2014)
Great successes of deep neural networks have been witnessed in various r...

Excitation Dropout: Encouraging Plasticity in Deep Neural Networks (05/23/2018)
We propose a guided dropout regularizer for deep networks based on the e...

The Hybrid Bootstrap: A Drop-in Replacement for Dropout (01/22/2018)
Regularization is an important component of predictive model building. T...

Dropout Prediction Variation Estimation Using Neuron Activation Strength (10/13/2021)
It is well-known DNNs would generate different prediction results even g...

Dropout Attacks (09/04/2023)
Dropout is a common operator in deep learning, aiming to prevent overfit...

Fast Power system security analysis with Guided Dropout (01/30/2018)
We propose a new method to efficiently compute load-flows (the steady-st...
