A Bayesian Variational Framework for Stochastic Optimization

05/05/2019
by Philippe Casgrain, et al.

This work proposes a theoretical framework for stochastic optimization algorithms, based on a continuous-time Bayesian variational model of the algorithm. Using techniques from stochastic control with asymmetric information, the solution to this variational problem is shown to be equivalent to a system of forward-backward stochastic differential equations (FBSDEs). By analytically approximating the solution of these FBSDEs, we recover a variety of existing adaptive stochastic gradient descent methods. The framework thereby establishes a direct connection between stochastic optimization algorithms and an auxiliary Bayesian inference problem on gradients, in which the choice of prior and assumed observation dynamics determines the resulting algorithm.
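To make this correspondence concrete, here is a minimal, hypothetical sketch of the idea that the abstract describes: the true gradient is treated as a latent state with a random-walk prior, noisy minibatch gradients are its observations, and the descent direction is the posterior mean produced by a per-coordinate Kalman filter. This is an illustration of the general "Bayesian inference on gradients" viewpoint, not the paper's FBSDE construction; the names `bayesian_sgd`, `process_var`, and `obs_var` and the random-walk prior are assumptions made for this example.

```python
import numpy as np

def bayesian_sgd(grad_fn, x0, steps=500, lr=0.1,
                 process_var=1e-2, obs_var=1.0):
    """Hypothetical sketch: descend along the posterior mean of a latent
    gradient, estimated by a per-coordinate Kalman filter. The random-walk
    prior and all parameter names are illustrative assumptions."""
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)   # posterior mean of the latent gradient
    P = np.ones_like(x)    # posterior variance, per coordinate
    for _ in range(steps):
        g = grad_fn(x)             # noisy minibatch gradient = observation
        P = P + process_var        # predict: latent gradient drifts
        K = P / (P + obs_var)      # scalar Kalman gain per coordinate
        m = m + K * (g - m)        # update posterior mean toward observation
        P = (1.0 - K) * P          # update posterior variance
        x = x - lr * m             # descend along the filtered gradient
    return x

# Example: a noisy quadratic. Filtering damps the gradient noise, so the
# iterates settle near the minimizer at the origin.
rng = np.random.default_rng(0)
noisy_grad = lambda x: 2.0 * x + rng.normal(scale=1.0, size=x.shape)
print(bayesian_sgd(noisy_grad, x0=np.array([5.0, -3.0])))
```

Note that with a constant gain K, the update m = m + K (g - m) is exactly an exponential moving average of observed gradients, the first-moment estimate used by momentum and Adam-style methods; changing the prior or the observation model changes the gain schedule, and hence the recovered algorithm, in line with the abstract's claim.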

Related research

A Latent Variational Framework for Stochastic Optimization (05/05/2019)
This paper provides a unifying theoretical framework for stochastic opti...

A Unified Approach to Adaptive Regularization in Online and Stochastic Optimization (06/20/2017)
We describe a framework for deriving and analyzing online optimization a...

Polygonal Unadjusted Langevin Algorithms: Creating stable and efficient adaptive algorithms for neural networks (05/28/2021)
We present a new class of adaptive stochastic optimization algorithms, w...

Bayesian Learning via Neural Schrödinger-Föllmer Flows (11/20/2021)
In this work we explore a new framework for approximate Bayesian inferen...

Follow the Signs for Robust Stochastic Optimization (05/22/2017)
Stochastic noise on gradients is now a common feature in machine learnin...

Robust, Accurate Stochastic Optimization for Variational Inference (09/01/2020)
We consider the problem of fitting variational posterior approximations ...
