On the Inductive Bias of Dropout

12/15/2014
by David P. Helmbold et al.

Dropout is a simple but effective technique for learning in neural networks and other settings. A sound theoretical understanding of dropout is needed to determine when dropout should be applied and how to use it most effectively. In this paper we continue the exploration of dropout as a regularizer pioneered by Wager et al. We focus on linear classification where a convex proxy to the misclassification loss (i.e., the logistic loss used in logistic regression) is minimized. We show: (a) when the dropout-regularized criterion has a unique minimizer, (b) when the dropout-regularization penalty goes to infinity with the weights and when it remains bounded, (c) that the dropout regularization penalty can be non-monotonic as individual weights increase from 0, and (d) that the dropout regularization penalty may not be convex. This last point is particularly surprising because the combination of dropout regularization with any convex loss proxy is always a convex function. In order to contrast dropout regularization with L_2 regularization, we formalize the notion of when different sources are more compatible with different regularizers. We then exhibit distributions that are provably more compatible with dropout regularization than with L_2 regularization, and vice versa. These sources provide additional insight into how the inductive biases of dropout and L_2 regularization differ. We also provide some similar results for L_1 regularization.
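To make the object of study concrete, below is a minimal numerical sketch of the dropout-regularized criterion for linear classification with the logistic loss. It assumes input-level Bernoulli dropout with keep probability q (surviving features rescaled by 1/q, the usual inverted-dropout convention) and defines the dropout penalty as the gap between the expected loss under random masks and the loss on the unperturbed inputs; the function names, toy data, and Monte Carlo estimator are illustrative, not the paper's own code.

import numpy as np

def logistic_loss(w, X, y):
    # average logistic loss of the linear classifier w on (X, y), with y in {-1, +1}
    margins = y * (X @ w)
    return float(np.mean(np.log1p(np.exp(-margins))))

def dropout_criterion(w, X, y, q, n_samples=5000, seed=0):
    # Monte Carlo estimate of E_B[logistic loss on inputs B * X / q],
    # where each entry of the mask B is an independent Bernoulli(q) keep indicator
    rng = np.random.default_rng(seed)
    losses = [logistic_loss(w, X * (rng.random(X.shape) < q) / q, y)
              for _ in range(n_samples)]
    return float(np.mean(losses))

def dropout_penalty(w, X, y, q, **kw):
    # the implicit regularizer: dropout criterion minus the unperturbed loss
    return dropout_criterion(w, X, y, q, **kw) - logistic_loss(w, X, y)

# probe the penalty along a single weight on toy data; the paper proves the
# penalty can be non-monotonic in individual weights and non-convex overall
X = np.array([[1.0, 1.0], [1.0, -1.0]])
y = np.array([1.0, -1.0])
for a in (0.0, 0.5, 1.0, 2.0, 4.0):
    print(a, dropout_penalty(np.array([a, 0.0]), X, y, q=0.5))

Note that the dropout criterion itself (plain loss plus penalty) remains convex in w, since it is an average of convex logistic losses over random masks, even where the penalty term alone is not; this is exactly the tension that point (d) of the abstract highlights.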

Related research

- Surprising properties of dropout in deep networks (02/14/2016)
- The Flip Side of the Reweighted Coin: Duality of Adaptive Dropout and Regularization (06/14/2021)
- On the Regularization Properties of Structured Dropout (10/30/2019)
- An Analysis of Dropout for Matrix Factorization (10/10/2017)
- Dropout Regularization Versus ℓ_2-Penalization in the Linear Model (06/18/2023)
- Weight Expansion: A New Perspective on Dropout and Generalization (01/23/2022)
- On Dropout, Overfitting, and Interaction Effects in Deep Neural Networks (07/02/2020)
