Variational consensus Monte Carlo

06/09/2015
by Maxim Rabinovich, et al.

Practitioners of Bayesian statistics have long depended on Markov chain Monte Carlo (MCMC) to obtain samples from intractable posterior distributions. Unfortunately, MCMC algorithms are typically serial, and do not scale to the large datasets typical of modern machine learning. The recently proposed consensus Monte Carlo algorithm removes this limitation by partitioning the data and drawing samples conditional on each partition in parallel (Scott et al., 2013). A fixed aggregation function then combines these samples, yielding approximate posterior samples. We introduce variational consensus Monte Carlo (VCMC), a variational Bayes algorithm that optimizes over aggregation functions to obtain samples from a distribution that better approximates the target. The resulting objective contains an intractable entropy term; we therefore derive a relaxation of the objective and show that the relaxed problem is blockwise concave under mild conditions. We illustrate the advantages of our algorithm on three inference tasks from the literature, demonstrating both the superior quality of the posterior approximation and the moderate overhead of the optimization step. Our algorithm achieves a relative error reduction (measured against serial MCMC) of up to 39% when estimating 300-dimensional probit regression parameter expectations; similarly, it achieves a 92% error reduction when estimating cluster comembership probabilities in a Gaussian mixture model with 8 components in 8 dimensions. Furthermore, these gains come at moderate cost compared to the runtime of serial MCMC, achieving near-ideal speedup in some instances.
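To make the aggregation step concrete, the following is a minimal sketch of the fixed precision-weighted averaging rule from consensus Monte Carlo (Scott et al., 2013), which is the kind of aggregation function VCMC then optimizes over. The subposterior samples below are synthetic Gaussian placeholders for illustration, not the paper's experiments.

```python
import numpy as np

rng = np.random.default_rng(0)
K, n_samples, d = 4, 1000, 3  # workers, samples per worker, dimension

# Pretend each worker k drew samples from its subposterior in parallel
# (here just Gaussians with slightly different spreads, as a stand-in).
sub_samples = [rng.normal(loc=1.0, scale=1.0 + 0.2 * k, size=(n_samples, d))
               for k in range(K)]

# Weight each subposterior by its inverse sample covariance (precision),
# the standard consensus Monte Carlo choice for Gaussian subposteriors.
precisions = [np.linalg.inv(np.cov(s, rowvar=False)) for s in sub_samples]
total_precision = sum(precisions)

# The i-th aggregated sample combines the i-th sample from every worker:
# theta_i = (sum_k P_k)^{-1} sum_k P_k theta_{k,i}.
combined = np.linalg.solve(
    total_precision,
    sum(P @ s.T for P, s in zip(precisions, sub_samples)),
).T  # shape (n_samples, d): one approximate posterior sample per index
```

Where consensus Monte Carlo fixes these weights in advance, VCMC treats the aggregation function itself as a variational parameter and optimizes it against a relaxed variational objective.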


