Asymptotically efficient one-step stochastic gradient descent

06/09/2023
by Alain Bensoussan, et al.

A generic, fast, and asymptotically efficient method for parametric estimation is described. It is based on stochastic gradient descent on the log-likelihood function, corrected by a single step of the Fisher scoring algorithm. We show, theoretically and by simulations in the i.i.d. setting, that it is an interesting alternative to the usual stochastic gradient descent with averaging or the adaptive stochastic gradient descent.
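To make the two-stage construction concrete, here is a minimal sketch of the one-step idea in the i.i.d. setting, written in plain NumPy. It is not the authors' algorithm as stated in the paper: the statistical model (an exponential distribution with unknown rate), the Robbins-Monro step sizes, and the plug-in Fisher information estimate are illustrative assumptions; only the overall structure, a cheap SGD pass on the log-likelihood followed by a single Fisher-scoring correction, follows the abstract.

# A minimal sketch of a one-step estimator in the i.i.d. setting (illustrative only).
# Model assumed here: x_i ~ Exponential(rate theta), log-likelihood log(theta) - theta*x.
import numpy as np

rng = np.random.default_rng(0)
theta_true = 2.0
x = rng.exponential(scale=1.0 / theta_true, size=10_000)   # i.i.d. sample

def score(theta, xi):
    # d/d(theta) of the per-observation log-likelihood log(theta) - theta*xi
    return 1.0 / theta - xi

# Stage 1: plain stochastic gradient ascent on the log-likelihood (preliminary estimator).
theta = 1.0                                        # arbitrary starting point
for n, xi in enumerate(x, start=1):
    theta += (0.5 / n**0.7) * score(theta, xi)     # Robbins-Monro step, gamma_n ~ n^{-0.7}
    theta = max(theta, 1e-3)                       # keep the parameter in its domain

# Stage 2: a single Fisher-scoring (Newton) correction at the preliminary estimate,
#   theta_onestep = theta + I(theta)^{-1} * (average score over the sample),
# with the Fisher information I(theta) = 1/theta^2 for this exponential model.
mean_score = np.mean(1.0 / theta - x)
fisher_hat = 1.0 / theta**2
theta_onestep = theta + mean_score / fisher_hat

print(f"SGD estimate        : {theta:.4f}")
print(f"One-step correction : {theta_onestep:.4f}  (MLE = {1.0 / np.mean(x):.4f})")

Classically (Le Cam's one-step estimator), a single Newton or Fisher-scoring step applied to a sufficiently good preliminary estimator recovers asymptotic efficiency; the point of the abstract is to obtain that preliminary estimator cheaply with a plain stochastic gradient pass, rather than with averaging or an adaptive step-size scheme.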

Related research

03/16/2017  Conditional Accelerated Lazy Stochastic Gradient Descent
In this work we introduce a conditional accelerated lazy stochastic grad...

10/20/2022  A note on diffusion limits for stochastic gradient descent
In the machine learning literature stochastic gradient descent has recen...

09/10/2023  Is Learning in Biological Neural Networks based on Stochastic Gradient Descent? An analysis using stochastic processes
In recent years, there has been an intense debate about how learning in ...

11/22/2011  Stochastic gradient descent on Riemannian manifolds
Stochastic gradient descent is a simple approach to find the local minim...

02/22/2018  Iterate averaging as regularization for stochastic gradient descent
We propose and analyze a variant of the classic Polyak-Ruppert averaging...

12/22/2017  True Asymptotic Natural Gradient Optimization
We introduce a simple algorithm, True Asymptotic Natural Gradient Optimi...

04/10/2021  SGD Implicitly Regularizes Generalization Error
We derive a simple and model-independent formula for the change in the g...
