A note on diffusion limits for stochastic gradient descent

10/20/2022
by Alberto Lanconelli et al.

In the machine learning literature, stochastic gradient descent has recently been widely discussed for its purported implicit regularization properties. Much of the theory that attempts to clarify the role of noise in stochastic gradient algorithms approximates stochastic gradient descent by a stochastic differential equation with Gaussian noise. We provide a novel rigorous theoretical justification for this practice, showing how the Gaussianity of the noise arises naturally.
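For readers unfamiliar with the approximation in question, the following is a minimal sketch in standard notation; all symbols are illustrative assumptions (F the expected loss, f(·; ξ) a stochastic gradient sample, η the learning rate, Σ the gradient-noise covariance, W a Brownian motion) and not necessarily the exact formulation of the paper. The SGD recursion

    \theta_{k+1} = \theta_k - \eta \, \nabla f(\theta_k; \xi_k), \qquad \mathbb{E}_{\xi}\!\left[ \nabla f(\theta; \xi) \right] = \nabla F(\theta),

is, for small learning rate η, commonly approximated in distribution by the Itô stochastic differential equation

    d\Theta_t = -\nabla F(\Theta_t) \, dt + \sqrt{\eta} \, \Sigma(\Theta_t)^{1/2} \, dW_t,

where the Gaussian increments of W stand in for the minibatch gradient noise.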


Related research

06/09/2023 · Asymptotically efficient one-step stochastic gradient descent
A generic, fast and asymptotically efficient method for parametric estim...

10/03/2019 · Partial differential equation regularization for supervised machine learning
This article is an overview of supervised machine learning problems for ...

06/04/2021 · Fluctuation-dissipation Type Theorem in Stochastic Linear Learning
The fluctuation-dissipation theorem (FDT) is a simple yet powerful conse...

05/22/2018 · Efficient Stochastic Gradient Descent for Distributionally Robust Learning
We consider a new stochastic gradient descent algorithm for efficiently ...

02/09/2021 · Berry–Esseen Bounds for Multivariate Nonlinear Statistics with Applications to M-estimators and Stochastic Gradient Descent Algorithms
We establish a Berry–Esseen bound for general multivariate nonlinear sta...

01/25/2022 · On Uniform Boundedness Properties of SGD and its Momentum Variants
A theoretical, and potentially also practical, problem with stochastic g...

07/01/2022 · Analysis of Kinetic Models for Label Switching and Stochastic Gradient Descent
In this paper we provide a novel approach to the analysis of kinetic mod...
