Understanding Dropout: Training Multi-Layer Perceptrons with Auxiliary Independent Stochastic Neurons

06/12/2013
by Kyunghyun Cho, et al.

In this paper, a simple, general method of adding auxiliary stochastic neurons to a multi-layer perceptron is proposed. The proposed method is shown to generalize several recently successful techniques: dropout (Hinton et al., 2012), explicit noise injection (Vincent et al., 2010; Bishop, 1995), and semantic hashing (Salakhutdinov & Hinton, 2009). The framework also admits an extension of dropout that allows a separate dropping probability for each hidden neuron or layer. The use of separate dropping probabilities for different hidden layers is investigated empirically.
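The per-layer extension described above can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation: it applies standard inverted dropout (an assumed formulation, in which surviving activations are scaled by 1/(1-p) during training) with a separate dropping probability for each hidden layer of a small MLP.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout(x, p, training=True):
    """Zero each unit independently with probability p.
    Inverted dropout: survivors are scaled by 1/(1-p) so the
    expected activation matches the deterministic test-time pass."""
    if not training or p == 0.0:
        return x
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

def mlp_forward(x, weights, drop_ps, training=True):
    """Forward pass through an MLP, applying dropout probability
    drop_ps[i] to the activations of hidden layer i."""
    h = x
    for (W, b), p in zip(weights[:-1], drop_ps):
        h = np.maximum(0.0, h @ W + b)  # ReLU hidden layer
        h = dropout(h, p, training)
    W, b = weights[-1]
    return h @ W + b                    # linear output layer

# Example: two hidden layers with different dropping probabilities
# (layer sizes and the 0.2/0.5 rates are illustrative choices).
layers = [(4, 8), (8, 8), (8, 3)]
weights = [(rng.standard_normal((m, n)) * 0.1, np.zeros(n))
           for m, n in layers]
x = rng.standard_normal((5, 4))
out_train = mlp_forward(x, weights, drop_ps=[0.2, 0.5], training=True)
out_test = mlp_forward(x, weights, drop_ps=[0.2, 0.5], training=False)
```

At test time no units are dropped and no scaling is applied, so the forward pass is deterministic; the inverted scaling during training keeps the expected pre-activations of the two regimes aligned.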


Related research

07/02/2022  Parameter efficient dendritic-tree neurons outperform perceptrons
    Biological neurons are more powerful than artificial perceptrons, in par...

08/10/2018  Dropout is a special case of the stochastic delta rule: faster and more accurate deep learning
    Multi-layer neural networks have led to remarkable performance on many ...

02/06/2016  Improved Dropout for Shallow and Deep Learning
    Dropout has been witnessed with great success in training deep neural ne...

03/01/2010  Deep Big Simple Neural Nets Excel on Handwritten Digit Recognition
    Good old on-line back-propagation for plain multi-layer perceptrons yiel...

11/08/2019  Stacked dense optical flows and dropout layers to predict sperm motility and morphology
    In this paper, we analyse two deep learning methods to predict sperm mot...

07/18/2023  Can Neural Network Memorization Be Localized?
    Recent efforts at explaining the interplay of memorization and generaliz...

12/25/2018  Dropout Regularization in Hierarchical Mixture of Experts
    Dropout is a very effective method in preventing overfitting and has bec...
