On Fast Dropout and its Applicability to Recurrent Networks

11/04/2013
by   Justin Bayer, et al.

Recurrent Neural Networks (RNNs) are rich models for the processing of sequential data. Recent work on advancing the state of the art has focused on the optimization or modelling of RNNs, mostly motivated by addressing the problems of vanishing and exploding gradients. The control of overfitting has seen considerably less attention. This paper contributes to that area by analyzing fast dropout, a recent regularization method for generalized linear models and neural networks, from a back-propagation inspired perspective. We show that fast dropout implements a quadratic form of an adaptive, per-parameter regularizer, which rewards large weights in the light of underfitting, penalizes them for overconfident predictions and vanishes at minima of an unregularized training loss. The derivatives of that regularizer are exclusively based on the training error signal. One consequence of this is the absence of a global weight attractor, which is particularly appealing for RNNs, since the dynamics are not biased towards a certain regime. We positively test the hypothesis that this improves the performance of RNNs on four musical data sets.
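The core idea behind fast dropout is to avoid sampling Bernoulli masks altogether: under dropout, each pre-activation becomes a sum of many independently masked terms, so it is well approximated by a Gaussian whose mean and variance follow in closed form from the drop rate. The sketch below illustrates this mean/variance propagation for a single linear layer; the function and parameter names are illustrative, not taken from the paper's code.

```python
import numpy as np

def fast_dropout_layer(x, W, b, p=0.5, rng=None):
    """Gaussian approximation of dropout for one linear layer.

    Instead of sampling a Bernoulli(1-p) mask per input unit, the
    pre-activation z = (m * x) @ W + b is approximated by a Gaussian
    with matched mean and variance. Illustrative sketch only.
    """
    rng = np.random.default_rng() if rng is None else rng
    keep = 1.0 - p
    # E[z] under the mask: each input survives with probability `keep`
    mean = keep * (x @ W) + b
    # Var[z] under the mask: independent Bernoulli terms add variances
    var = keep * (1.0 - keep) * ((x ** 2) @ (W ** 2))
    # Draw the pre-activation from the matched Gaussian and squash it
    z = mean + np.sqrt(var) * rng.standard_normal(mean.shape)
    return np.tanh(z)
```

Because the noise enters through a smooth Gaussian rather than discrete masks, the expected loss (and hence the adaptive regularizer the paper analyzes) can be differentiated directly; with p = 0 the variance term vanishes and the layer reduces to an ordinary deterministic linear layer.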

