
Accelerating MCMC algorithms through Bayesian Deep Networks

by Hector J. Hortua, et al.

Markov Chain Monte Carlo (MCMC) algorithms are commonly used for their versatility in sampling from complicated probability distributions. However, as the dimension of the distribution grows, the computational cost of a satisfactory exploration of the sampling space becomes challenging. Adaptive MCMC methods that employ a tailored choice of proposal distribution can address this issue, thereby speeding up convergence. In this paper we show an alternative way of performing adaptive MCMC: using the output of Bayesian Neural Networks as the initial proposal for the Markov chain. This combined approach increases the acceptance rate in the Metropolis-Hastings algorithm and accelerates the convergence of the MCMC while reaching the same final accuracy. Finally, we demonstrate the main advantages of this approach by constraining the cosmological parameters directly from Cosmic Microwave Background maps.
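The idea of warm-starting a chain from a network's prediction can be illustrated with a minimal random-walk Metropolis-Hastings sampler. This is a generic sketch, not the paper's implementation: the `init` and `proposal_std` arguments stand in for a Bayesian Neural Network's predicted parameter mean and uncertainty, and the toy Gaussian target is a hypothetical stand-in for a cosmological posterior.

```python
import numpy as np

def metropolis_hastings(log_target, init, proposal_std, n_steps, seed=None):
    """Random-walk Metropolis-Hastings starting from `init`.

    A BNN-informed run would set `init` to the network's posterior mean and
    `proposal_std` to its predicted uncertainty, so the chain begins near the
    high-probability region instead of from a cold start.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(init, dtype=float)
    logp = log_target(x)
    samples, accepted = [], 0
    for _ in range(n_steps):
        # Symmetric Gaussian proposal centred on the current state.
        cand = x + rng.normal(0.0, proposal_std, size=x.shape)
        logp_cand = log_target(cand)
        # Metropolis-Hastings acceptance test in log space.
        if np.log(rng.uniform()) < logp_cand - logp:
            x, logp = cand, logp_cand
            accepted += 1
        samples.append(x.copy())
    return np.array(samples), accepted / n_steps

# Toy target: a standard 2-D Gaussian. Starting near the mode with a
# well-scaled proposal keeps the acceptance rate healthy.
log_target = lambda x: -0.5 * np.sum(x**2)
samples, acc_rate = metropolis_hastings(
    log_target, init=[0.1, -0.1], proposal_std=0.5, n_steps=5000, seed=0
)
```

In the paper's setting the gain comes from the initialization: a chain started far from the posterior mode spends many steps in burn-in, whereas a BNN-informed start removes most of that cost while leaving the stationary distribution, and hence the final accuracy, unchanged.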


