Dropout Regularization Versus ℓ_2-Penalization in the Linear Model

06/18/2023
by Gabriel Clara, et al.

We investigate the statistical behavior of gradient descent iterates with dropout in the linear regression model. In particular, non-asymptotic bounds for expectations and covariance matrices of the iterates are derived. In contrast with the widely cited connection between dropout and ℓ_2-regularization in expectation, the results indicate a much more subtle relationship, owing to interactions between the gradient descent dynamics and the additional randomness induced by dropout. We also study a simplified variant of dropout which does not have a regularizing effect and converges to the least squares estimator.
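The abstract's setting can be illustrated with a small simulation. The sketch below is not taken from the paper: the data, step size, and retention probability p are invented for illustration. It runs gradient descent on a linear model where each step applies an independent Bernoulli(p) mask (scaled by 1/p) to the coordinates, and compares the result to the ridge-type estimator suggested by the classical marginalization heuristic, under which the dropout objective in expectation adds the penalty ((1-p)/p) β^T diag(X^T X) β.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear regression data (illustrative only, not from the paper).
n, d = 200, 5
X = rng.standard_normal((n, d))
beta_true = rng.standard_normal(d)
y = X @ beta_true + 0.1 * rng.standard_normal(n)

p = 0.8       # dropout retention probability (assumed for this sketch)
lr = 1e-3     # gradient descent step size
steps = 20_000

# Gradient descent with dropout: each iteration draws a fresh Bernoulli(p)
# mask, rescaled by 1/p, and takes a gradient step on the masked objective
# ||y - X (mask * beta)||^2 / (2n).
beta = np.zeros(d)
for _ in range(steps):
    mask = rng.binomial(1, p, size=d) / p
    resid = X @ (mask * beta) - y
    beta -= lr * mask * (X.T @ resid) / n

# Classical heuristic: marginalizing over the mask yields a ridge-type
# problem with penalty matrix Gamma = ((1-p)/p) * diag(X^T X / n).
Gamma = (1 - p) / p * np.diag(np.diag(X.T @ X / n))
beta_ridge = np.linalg.solve(X.T @ X / n + Gamma, X.T @ y / n)

print("dropout iterate:", beta)
print("ridge-type fit: ", beta_ridge)
```

In this toy run the dropout iterate hovers near the ridge-type solution, but, as the paper argues, the iterates' expectations and covariances are governed by a subtler interaction between the gradient dynamics and the mask randomness than the marginalization heuristic alone suggests.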


Related research:

- The Implicit and Explicit Regularization Effects of Dropout (02/28/2020): Dropout is a widely-used regularization technique, often required to obt...
- Implicit regularization of dropout (07/13/2022): It is important to understand how the popular regularization method drop...
- Dropout Drops Double Descent (05/25/2023): In this paper, we find and analyze that we can easily drop the double de...
- On the Inductive Bias of Dropout (12/15/2014): Dropout is a simple but effective technique for learning in neural netwo...
- Dropout Regularization in Extended Generalized Linear Models based on Double Exponential Families (05/11/2023): Even though dropout is a popular regularization technique, its theoretic...
- Regularized linear autoencoders recover the principal components, eventually (07/13/2020): Our understanding of learning input-output relationships with neural net...
- Spectral Universality of Regularized Linear Regression with Nearly Deterministic Sensing Matrices (08/04/2022): It has been observed that the performances of many high-dimensional esti...
