Analysis of dropout learning regarded as ensemble learning

06/20/2017
by Kazuyuki Hara, et al.

Deep learning is the state of the art in fields such as visual object recognition and speech recognition. Such learning uses a large number of layers, a huge number of units, and many connections, so overfitting is a serious problem. To mitigate this problem, dropout learning has been proposed. Dropout learning neglects some inputs and hidden units during training, each with probability p; afterwards, the neglected inputs and hidden units are combined with the learned network to produce the final output. We find that this process of combining the neglected hidden units with the learned network can be regarded as ensemble learning, so we analyze dropout learning from that point of view.
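As a minimal sketch (not the paper's code) of the mechanism described above: during training each unit is dropped with probability p, and at inference all units are kept but scaled by the expected keep rate (1 - p), which corresponds to averaging over the ensemble of thinned networks. The function name and shapes below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_forward(x, p, train=True):
    # Training: neglect each unit independently with probability p
    # (set its activation to zero), as in dropout learning.
    # Inference: keep every unit but scale by (1 - p), the expected
    # keep rate, approximating the average of the thinned networks.
    if train:
        mask = rng.random(x.shape) >= p  # True = unit is kept
        return x * mask
    return x * (1.0 - p)

# Toy hidden-layer activations
h = np.ones(1000)
train_out = dropout_forward(h, p=0.5, train=True)   # roughly half zeroed
test_out = dropout_forward(h, p=0.5, train=False)   # all scaled to 0.5
```

The scaled inference pass is what makes the combined network behave like the average of the exponentially many sub-networks sampled during training.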


