Importance Sampling Methods for Bayesian Inference with Partitioned Data

10/12/2022
by Marc Box, et al.

This article presents new methodology for sample-based Bayesian inference when data are partitioned and communication between the parts is expensive, as arises by necessity in the context of "big data" or by choice in order to take advantage of computational parallelism. The method, which we call the Laplace enriched multiple importance estimator, uses new multiple importance sampling techniques to approximate posterior expectations using samples drawn independently from the local posterior distributions (those conditioned on isolated parts of the data). We construct Laplace approximations from which additional samples can be drawn relatively quickly, improving the estimators in high-dimensional settings. The methods are "embarrassingly parallel", place no restriction on the sampling algorithm (including MCMC) or on the choice of prior distribution, and do not rely on any assumptions about the posterior such as normality. The performance of the methods is demonstrated and compared against some alternatives in experiments with simulated data.
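To illustrate the core idea, here is a minimal sketch of combining samples from local posteriors via multiple importance sampling with the balance heuristic. It uses a hypothetical toy conjugate Gaussian model (known variance, flat prior) so the full posterior is available in closed form as the reweighting target; the paper's actual estimator instead builds the target from the local posteriors and Laplace approximations, which this sketch does not reproduce.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model: y_i ~ N(theta, sigma^2), sigma known, flat prior on theta.
sigma = 1.0
data = rng.normal(2.0, sigma, size=200)
parts = np.array_split(data, 4)  # the partitioned data

# Local posteriors are conjugate Gaussians: theta | part_j ~ N(mean_j, sigma^2 / n_j)
locs = np.array([p.mean() for p in parts])
scales = np.array([sigma / np.sqrt(len(p)) for p in parts])

# Draw samples independently from each local posterior (no communication needed)
S = 2000
samples = np.concatenate([rng.normal(m, s, S) for m, s in zip(locs, scales)])

# Under a flat prior the full posterior is N(data.mean(), sigma^2 / N);
# in the partitioned setting this target would be approximated, not known.
full_loc, full_scale = data.mean(), sigma / np.sqrt(len(data))

def normal_logpdf(x, loc, scale):
    return -0.5 * ((x - loc) / scale) ** 2 - np.log(scale * np.sqrt(2.0 * np.pi))

# Multiple importance sampling, balance heuristic:
# weight = target density / equal-weight mixture of the local-posterior densities
log_target = normal_logpdf(samples, full_loc, full_scale)
log_mix = np.logaddexp.reduce(
    [normal_logpdf(samples, m, s) for m, s in zip(locs, scales)], axis=0
) - np.log(len(parts))
log_w = log_target - log_mix
w = np.exp(log_w - log_w.max())
w /= w.sum()  # self-normalized importance weights

# Self-normalized IS estimate of the posterior mean E[theta | data]
post_mean = float(np.sum(w * samples))
print(post_mean, full_loc)
```

The balance heuristic treats the pooled draws as coming from an equal-weight mixture of the local posteriors, which keeps the weights stable even when the local posteriors differ in location or spread.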
