A Latent Variational Framework for Stochastic Optimization

05/05/2019
by Philippe Casgrain, et al.

This paper provides a unifying theoretical framework for stochastic optimization algorithms by means of a latent stochastic variational problem. Using techniques from stochastic control, the solution to the variational problem is shown to be equivalent to that of a Forward-Backward Stochastic Differential Equation (FBSDE). By solving these equations, we recover a variety of existing adaptive stochastic gradient descent methods. The framework establishes a direct connection between stochastic optimization algorithms and a secondary Bayesian inference problem on gradients, in which the prior measure placed on noisy gradient observations determines the resulting algorithm.
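As a rough illustration of the kind of connection the abstract describes (not the paper's construction), the sketch below filters noisy gradient observations with a simple Gaussian model and descends along the posterior-mean gradient; under this assumed prior the estimate collapses to an exponentially weighted average of past gradients, which is the shape of a momentum-style method. All names (make_bayesian_sgd, obs_noise, prior_var) and the scalar Kalman-style recursion are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch: adaptive SGD whose descent direction is the posterior
# mean of the gradient under an assumed Gaussian observation model.
import numpy as np

def make_bayesian_sgd(lr=0.1, obs_noise=1.0, prior_var=1.0):
    """Return a stateful update rule; all parameter names are illustrative."""
    state = {"mean": None, "var": prior_var}

    def step(theta, noisy_grad):
        if state["mean"] is None:
            state["mean"] = np.zeros_like(noisy_grad)
        # Scalar Kalman-style update of the belief about the true gradient.
        gain = state["var"] / (state["var"] + obs_noise)
        state["mean"] = state["mean"] + gain * (noisy_grad - state["mean"])
        # Process-noise term: allow the true gradient to drift between steps.
        state["var"] = (1.0 - gain) * state["var"] + prior_var
        # Descend along the posterior-mean gradient (an exponentially
        # weighted average of past noisy gradients).
        return theta - lr * state["mean"]

    return step

# Usage: minimize f(x) = ||x||^2 / 2 with additive Gaussian gradient noise.
rng = np.random.default_rng(0)
step = make_bayesian_sgd(lr=0.2)
theta = np.ones(5)
for _ in range(200):
    noisy_grad = theta + 0.5 * rng.normal(size=theta.shape)
    theta = step(theta, noisy_grad)
print(np.linalg.norm(theta))  # should shrink well below the initial norm sqrt(5)
```

A different choice of prior on the gradient process would yield a different filtering recursion and hence a different update rule, which is the sense in which the prior determines the algorithm.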

Related research

05/05/2019
A Bayesian Variational Framework for Stochastic Optimization
This work proposes a theoretical framework for stochastic optimization a...

05/22/2017
Follow the Signs for Robust Stochastic Optimization
Stochastic noise on gradients is now a common feature in machine learnin...

10/27/2017
SGDLibrary: A MATLAB library for stochastic gradient descent algorithms
We consider the problem of finding the minimizer of a function f: R^d → R...

06/20/2017
A Unified Approach to Adaptive Regularization in Online and Stochastic Optimization
We describe a framework for deriving and analyzing online optimization a...

03/29/2017
Probabilistic Line Searches for Stochastic Optimization
In deterministic optimization, line searches are a standard tool ensurin...

04/02/2022
Application of Stochastic Optimization Techniques to the Unit Commitment Problem – A Review
Due to the established energy production methods' contribution to the cli...
