Altitude Training: Strong Bounds for Single-Layer Dropout

07/11/2014
by Stefan Wager et al.

Dropout training, originally designed for deep neural networks, has been successful on high-dimensional single-layer natural language tasks. This paper proposes a theoretical explanation for this phenomenon: we show that, under a generative Poisson topic model with long documents, dropout training improves the exponent in the generalization bound for empirical risk minimization. Dropout achieves this gain much like a marathon runner who practices at altitude: once a classifier learns to perform reasonably well on training examples that have been artificially corrupted by dropout, it will do very well on the uncorrupted test set. We also show that, under similar conditions, dropout preserves the Bayes decision boundary and should therefore induce minimal bias in high dimensions.
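
As a concrete illustration of the setting the abstract describes, here is a minimal sketch (not the authors' code) of single-layer dropout training: logistic regression on bag-of-words features, where each training example is corrupted by dropout before every gradient step while test-time predictions use the clean inputs. The data, dimensions, and hyperparameters below are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy bag-of-words data (assumed for illustration): n documents,
# d vocabulary terms, binary labels from a synthetic linear rule.
n, d = 200, 1000
X = rng.poisson(0.05, size=(n, d)).astype(float)  # sparse word counts
true_w = rng.normal(size=d)
y = (X @ true_w > 0).astype(float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_logistic_dropout(X, y, p_drop=0.5, lr=0.1, epochs=50):
    """SGD for logistic regression with feature dropout.

    Each feature is kept with probability (1 - p_drop); kept features
    are rescaled by 1 / (1 - p_drop) so the corrupted input is an
    unbiased version of the clean one.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            mask = rng.random(d) >= p_drop           # dropout corruption
            x_tilde = X[i] * mask / (1.0 - p_drop)   # unbiased rescaling
            grad = (sigmoid(x_tilde @ w) - y[i]) * x_tilde
            w -= lr * grad
    return w

w_hat = train_logistic_dropout(X, y)

# The classifier trains on corrupted examples but is evaluated on clean
# ones -- the "altitude training" effect described in the abstract.
acc = np.mean((sigmoid(X @ w_hat) > 0.5) == y)
print(f"accuracy on clean data: {acc:.3f}")
```

The 1/(1 - p_drop) rescaling keeps the corrupted features unbiased, so the same weight vector can be applied unchanged to uncorrupted inputs at test time.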


Related research

Analysing Dropout and Compounding Errors in Neural Language Models (11/02/2018)
This paper carries out an empirical analysis of various dropout techniqu...

On Convergence and Generalization of Dropout Training (10/23/2020)
We study dropout in two-layer neural networks with rectified linear unit...

Data Augmentation via Levy Processes (03/21/2016)
If a document is about travel, we may expect that short snippets of the ...

Deep Learning under Privileged Information Using Heteroscedastic Dropout (05/29/2018)
Unlike machines, humans learn through rapid, abstract model-building. Th...

On the Implicit Bias of Dropout (06/26/2018)
Algorithmic approaches endow deep learning systems with implicit bias th...

Dropout can Simulate Exponential Number of Models for Sample Selection Techniques (02/26/2022)
Following Coteaching, generally in the literature, two models are used i...

Pushing the bounds of dropout (05/23/2018)
We show that dropout training is best understood as performing MAP estim...
