A Complete Recipe for Stochastic Gradient MCMC

06/15/2015
by Yi-an Ma, et al.

Many recent Markov chain Monte Carlo (MCMC) samplers leverage continuous dynamics to define a transition kernel that efficiently explores a target distribution. In tandem, a focus has been on devising scalable variants that subsample the data and use stochastic gradients in place of full-data gradients in the dynamic simulations. However, such stochastic gradient MCMC samplers have lagged behind their full-data counterparts in terms of the complexity of the dynamics considered, since proving convergence in the presence of stochastic gradient noise is non-trivial. Even with simple dynamics, significant physical intuition is often required to modify the dynamical system to account for the stochastic gradient noise. In this paper, we provide a general recipe for constructing MCMC samplers, including stochastic gradient versions, based on continuous Markov processes specified via two matrices. We constructively prove that the framework is complete. That is, any continuous Markov process that provides samples from the target distribution can be written in our framework. We show how previous continuous-dynamic samplers can be trivially "reinvented" in our framework, avoiding the complicated sampler-specific proofs. We likewise use our recipe to straightforwardly propose a new state-adaptive sampler: stochastic gradient Riemann Hamiltonian Monte Carlo (SGRHMC). Our experiments on simulated data and a streaming Wikipedia analysis demonstrate that the proposed SGRHMC sampler inherits the benefits of Riemann HMC, with the scalability of stochastic gradient methods.
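For intuition, the sketch below shows how the two-matrix recipe can be simulated with a simple Euler-Maruyama discretization: the drift couples a positive semidefinite diffusion matrix D(z) and a skew-symmetric curl matrix Q(z) to a stochastic gradient of H(z), where the target is p(z) proportional to exp(-H(z)), plus a correction term Gamma(z). This is an illustrative sketch only; the function and variable names (sg_recipe_step, stoch_grad_H, Gamma) are hypothetical and not taken from the authors' code.

```python
import numpy as np

def sg_recipe_step(z, stoch_grad_H, D, Q, Gamma, eps, rng):
    """One Euler-Maruyama step of the two-matrix recipe (illustrative sketch).

    z            : current state vector (numpy array of length d)
    stoch_grad_H : function returning a stochastic (minibatch) estimate of grad H(z)
    D, Q         : functions returning the d x d positive semidefinite diffusion
                   matrix and skew-symmetric curl matrix at z
    Gamma        : function returning the correction term with components
                   Gamma_i(z) = sum_j d(D_ij + Q_ij)/dz_j
    eps          : step size
    rng          : numpy random Generator
    """
    Dz, Qz = D(z), Q(z)
    # Drift: -(D + Q) * grad H + Gamma, evaluated with the stochastic gradient.
    drift = -(Dz + Qz) @ stoch_grad_H(z) + Gamma(z)
    # Injected noise has covariance 2 * eps * D(z).
    noise = rng.multivariate_normal(np.zeros_like(z), 2.0 * eps * Dz)
    return z + eps * drift + noise

# Simplest instance: D(z) = I, Q(z) = 0, Gamma(z) = 0 recovers stochastic
# gradient Langevin dynamics (SGLD); other choices of D and Q yield samplers
# such as SGHMC and the paper's SGRHMC.
```

In practice the step size is taken small (or decreased over iterations) so that the discretization and stochastic-gradient noise vanish in the limit; the paper also discusses compensating the injected noise for the gradient-noise covariance.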
