Multilevel Monte Carlo for Scalable Bayesian Computations

09/15/2016
by Mike Giles, et al.

Markov chain Monte Carlo (MCMC) algorithms are ubiquitous in Bayesian computations. However, they need to access the full data set in order to evaluate the posterior density at every step of the algorithm, which imposes a heavy computational burden in big data applications. In contrast, Stochastic Gradient MCMC (SGMCMC) algorithms such as Stochastic Gradient Langevin Dynamics (SGLD) only require access to a batch of the data set at every step. This drastically improves computational performance and scales well to large data sets. However, SGMCMC algorithms are sensitive to their tuning parameters, such as the step size, which are notoriously difficult to choose. Moreover, their Root Mean Square Error (RMSE) scales as O(c^-1/3), as opposed to O(c^-1/2) for standard MCMC, where c is the computational cost. We introduce a new class of Multilevel Stochastic Gradient Markov chain Monte Carlo algorithms that mitigate the problem of tuning the step size and, more importantly, recover the O(c^-1/2) convergence of standard Markov chain Monte Carlo methods without the need to introduce Metropolis-Hastings steps. A further advantage of this new class of algorithms is that it can easily be parallelised over a heterogeneous computer architecture. We illustrate our methodology using Bayesian logistic regression and provide numerical evidence that, for a prescribed relative RMSE, the computational cost is sublinear in the number of data items.
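For readers unfamiliar with SGLD, the sketch below shows a single stochastic gradient Langevin step for Bayesian logistic regression with minibatches. It is a minimal illustration under assumed conventions (a standard normal prior, placeholder names such as sgld_step, step_size, batch_size), not the paper's multilevel algorithm, which combines estimates from chains run at different step sizes in the usual MLMC telescoping-sum fashion.

```python
# Minimal SGLD sketch for Bayesian logistic regression (illustrative only).
# Assumes data X (n x d), labels y in {0, 1}, a standard normal prior on theta,
# and rng = np.random.default_rng(); all names here are placeholders.
import numpy as np

def sgld_step(theta, X, y, step_size, batch_size, rng):
    n = X.shape[0]
    idx = rng.choice(n, size=batch_size, replace=False)    # random minibatch
    Xb, yb = X[idx], y[idx]
    p = 1.0 / (1.0 + np.exp(-Xb @ theta))                   # sigmoid predictions
    grad_lik = (n / batch_size) * (Xb.T @ (yb - p))         # rescaled minibatch log-likelihood gradient
    grad_prior = -theta                                      # gradient of log N(0, I) prior
    noise = rng.normal(size=theta.shape)                     # injected Gaussian noise
    # Euler-Maruyama step of the Langevin diffusion targeting the posterior
    return theta + 0.5 * step_size * (grad_lik + grad_prior) + np.sqrt(step_size) * noise
```

Repeated application of such a step with a small, fixed step size produces an approximate posterior sample path; a multilevel construction of the kind described in the abstract would combine estimators computed at several step sizes to reduce the discretisation bias that limits plain SGLD.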


research
07/16/2019

Stochastic gradient Markov chain Monte Carlo

Markov chain Monte Carlo (MCMC) algorithms are generally regarded as the...
research
05/04/2016

Multi Level Monte Carlo methods for a class of ergodic stochastic differential equations

We develop a framework that allows the use of the multi-level Monte Carl...
research
02/07/2020

Extended Stochastic Gradient MCMC for Large-Scale Bayesian Variable Selection

Stochastic gradient Markov chain Monte Carlo (MCMC) algorithms have rece...
research
02/07/2020

Explicit Mean-Square Error Bounds for Monte-Carlo and Linear Stochastic Approximation

This paper concerns error bounds for recursive equations subject to Mark...
research
01/28/2019

Scalable Metropolis-Hastings for Exact Bayesian Inference with Large Datasets

Bayesian inference via standard Markov Chain Monte Carlo (MCMC) methods ...
research
12/16/2017

Parallel Markov Chain Monte Carlo for Bayesian Hierarchical Models with Big Data, in Two Stages

Due to the escalating growth of big data sets in recent years, new paral...
research
05/27/2021

Stochastic Gradient MCMC with Multi-Armed Bandit Tuning

Stochastic gradient Markov chain Monte Carlo (SGMCMC) is a popular class...
