Analysing Dropout and Compounding Errors in Neural Language Models

11/02/2018
by James O'Neill, et al.

This paper carries out an empirical analysis of dropout techniques for language modelling, including Bernoulli dropout, Gaussian dropout, Curriculum Dropout, Variational Dropout and Concrete Dropout. Moreover, we propose extensions of variational dropout to concrete dropout and to curriculum dropout with varying schedules. We find these extensions to perform well compared to standard dropout approaches, particularly variational curriculum dropout with a linear schedule. The largest performance increases are obtained when applying dropout on the decoder layer. Lastly, as a post-analysis step, we examine where most of the errors occur at test time to determine whether the well-known problem of compounding errors is apparent, and to what extent the proposed methods mitigate this issue on each dataset. We report results on 2-hidden-layer LSTM, GRU and Highway networks with embedding dropout, dropout on the gated hidden layers, and dropout on the output projection layer of each model. We evaluate on the Penn-TreeBank and WikiText-2 word-level language modelling datasets, where the former reduces the long-tail word distribution through preprocessing while the latter preserves rare words in the training and test sets.
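The dropout variants compared in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the function names and the linear-annealing direction of the curriculum schedule are our own illustrative assumptions.

```python
import numpy as np

def bernoulli_dropout(x, p, rng):
    # Zero each unit with probability p, then scale survivors by
    # 1/(1-p) (inverted dropout) so the expected activation is unchanged.
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

def gaussian_dropout(x, p, rng):
    # Multiplicative Gaussian noise with mean 1 and variance p/(1-p),
    # chosen to match the noise level of Bernoulli dropout at rate p.
    sigma = np.sqrt(p / (1.0 - p))
    return x * rng.normal(1.0, sigma, x.shape)

def linear_curriculum_p(step, total_steps, p_max):
    # Illustrative linear schedule for curriculum dropout: begin training
    # with no dropout and anneal the rate toward p_max.
    return p_max * min(step / total_steps, 1.0)
```

Both multiplicative-noise variants keep the expected activation equal to the input, which is why no rescaling is needed at test time; the schedule function would simply be queried once per training step to set the current rate.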


