Analysis of Dropout in Online Learning

11/09/2017
by Kazuyuki Hara, et al.

Deep learning is the state of the art in fields such as visual object recognition and speech recognition. Such learning uses a large number of layers with a huge number of units and connections, so overfitting is a serious problem, and dropout, a regularization technique, is commonly used to counter it. In online learning, however, the effect of dropout is not well understood. This paper presents our investigation of dropout in online learning. We analyzed its effect on convergence speed near the singular point. Our results indicate that dropout is effective in online learning: it tends to keep the learning dynamics away from the singular point, avoiding the slowdown in convergence that occurs near it.
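To make the setting concrete, the following is a minimal sketch of online learning with dropout: one example is presented per step, a fresh Bernoulli mask is drawn for the hidden units, and SGD updates flow only through the kept units. This is an illustration of standard inverted dropout in a small two-layer student-teacher setup, not the paper's actual model or analysis; all names and hyperparameters here are assumptions.

```python
import numpy as np


def dropout_mask(n_units, p_drop, rng):
    """Bernoulli mask with inverted-dropout scaling: kept units are scaled
    by 1/(1 - p_drop) so the expected activation is unchanged."""
    keep = (rng.random(n_units) >= p_drop).astype(float)
    return keep / (1.0 - p_drop)


def train_online(n_in=5, n_hidden=4, p_drop=0.5, steps=2000, lr=0.02):
    """Online SGD for a student network y = v . g(W x) learning a fixed
    random teacher, with dropout applied to the student's hidden units."""
    rng = np.random.default_rng(0)
    # Teacher: fixed random network that generates the targets.
    W_t = rng.standard_normal((n_hidden, n_in))
    v_t = rng.standard_normal(n_hidden)
    # Student: small random initialization.
    W = 0.1 * rng.standard_normal((n_hidden, n_in))
    v = 0.1 * rng.standard_normal(n_hidden)
    g = np.tanh
    for _ in range(steps):
        x = rng.standard_normal(n_in)            # one example per step (online)
        y_target = v_t @ g(W_t @ x)
        m = dropout_mask(n_hidden, p_drop, rng)  # fresh mask for every example
        pre = W @ x
        h = g(pre) * m                           # masked hidden activations
        err = v @ h - y_target
        # SGD on squared error; gradients vanish for dropped units (m_i = 0).
        v -= lr * err * h
        W -= lr * err * (v * m * (1.0 - g(pre) ** 2))[:, None] * x[None, :]
    return W, v


W, v = train_online()
```

Drawing an independent mask at every step is what makes dropout interact with the online dynamics: each update is taken on a randomly thinned subnetwork, which perturbs the trajectory and, per the paper's claim, helps it avoid lingering near the singular point.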


Related research

- Analysis of dropout learning regarded as ensemble learning (06/20/2017)
- Dropout Training as Adaptive Regularization (07/04/2013)
- Aux-Drop: Handling Haphazard Inputs in Online Learning Using Auxiliary Dropouts (03/09/2023)
- Macro-block dropout for improved regularization in training end-to-end speech recognition models (12/29/2022)
- Reconciling Feature-Reuse and Overfitting in DenseNet with Specialized Dropout (09/28/2018)
- On the Implicit Bias of Dropout (06/26/2018)
- Optical Phase Dropout in Diffractive Deep Neural Network (11/28/2020)
