SGD Implicitly Regularizes Generalization Error

04/10/2021
by Daniel A. Roberts, et al.

We derive a simple and model-independent formula for the change in the generalization gap due to a gradient descent update. We then compare the change in the test error for stochastic gradient descent to the change in test error from an equivalent number of gradient descent updates and show explicitly that stochastic gradient descent acts to regularize generalization error by decorrelating nearby updates. These calculations depend on the details of the model only through the mean and covariance of the gradient distribution, which may be readily measured for particular models of interest. We discuss further improvements to these calculations and comment on possible implications for stochastic optimization.
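Since the abstract notes that the relevant quantities are just the mean and covariance of the per-example gradient distribution, these can be estimated directly. The sketch below is illustrative, not the paper's method: it uses a toy linear model with squared loss (all names and the setup are assumptions) to show how one might measure the gradient mean and covariance over a training set.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative): linear regression with squared loss.
n, d = 256, 8
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

w = np.zeros(d)  # current parameter vector

# Per-example gradient of 0.5 * (x.w - y)^2 with respect to w is (x.w - y) * x.
residuals = X @ w - y
per_example_grads = residuals[:, None] * X  # shape (n, d)

# Empirical mean and covariance of the gradient distribution over the data.
g_mean = per_example_grads.mean(axis=0)          # shape (d,)
g_cov = np.cov(per_example_grads, rowvar=False)  # shape (d, d)

# The trace of the covariance is one scalar summary of gradient noise,
# the kind of quantity that distinguishes SGD from full-batch GD.
noise_scale = np.trace(g_cov)
print(g_mean.shape, g_cov.shape, noise_scale > 0)
```

For a model of actual interest, `per_example_grads` would be computed by backpropagation per training example (or with a vectorized per-sample gradient transform), but the mean/covariance estimation step is identical.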


