Importance Sampling Methods for Bayesian Inference with Partitioned Data

10/12/2022
by Marc Box, et al.

This article presents new methodology for sample-based Bayesian inference when data are partitioned and communication between the parts is expensive, as arises by necessity in the context of "big data" or by choice in order to exploit computational parallelism. The method, which we call the Laplace enriched multiple importance estimator, uses new multiple importance sampling techniques to approximate posterior expectations from samples drawn independently from the local posterior distributions (those conditioned on isolated parts of the data). We construct Laplace approximations from which additional samples can be drawn relatively quickly, improving the estimators in high-dimensional settings. The methods are "embarrassingly parallel", place no restriction on the sampling algorithm (including MCMC) or on the choice of prior distribution, and do not rely on assumptions about the posterior such as normality. We demonstrate the performance of the methods and compare them against some alternatives in experiments with simulated data.
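To make the idea concrete, here is a minimal sketch of multiple importance sampling with the balance heuristic, the building block the abstract refers to. This is not the authors' estimator: the target density, the two Gaussian "local posteriors" used as proposals, and all numerical values are hypothetical, chosen only to show how samples drawn independently from local distributions can be reweighted toward a common target.

```python
import numpy as np
from scipy import stats

# Hypothetical 1-D example: pretend the full posterior is N(0, 1) and that
# two "local posteriors" (each conditioned on part of the data) are the
# wider Gaussians below, used here as importance-sampling proposals.
rng = np.random.default_rng(0)

target = stats.norm(0.0, 1.0)                             # stand-in full posterior
local_posteriors = [stats.norm(-0.3, 1.4), stats.norm(0.4, 1.3)]
n = 5000                                                  # draws per local posterior

# Draw independently from each local posterior (the embarrassingly
# parallel step: no communication between parts is needed).
samples = np.concatenate([d.rvs(n, random_state=rng) for d in local_posteriors])

# Balance-heuristic weights: w(x) = p(x) / mean_j q_j(x), self-normalised,
# so the target density only needs to be known up to a constant.
mixture = np.mean([d.pdf(samples) for d in local_posteriors], axis=0)
w = target.pdf(samples) / mixture
w /= w.sum()

# Self-normalised estimate of a posterior expectation, here E[x].
post_mean = np.sum(w * samples)
print(post_mean)
```

In the partitioned-data setting the target would be the (unnormalised) product of the local posteriors divided by repeated prior factors, and a Laplace approximation to the posterior could be appended to `local_posteriors` as a cheap extra proposal; the self-normalised weights absorb the unknown normalising constant.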


