On the Scalability of Informed Importance Tempering

07/22/2021
by Quan Zhou, et al.

Informed MCMC methods have been proposed as scalable solutions to Bayesian posterior computation on high-dimensional discrete state spaces. We study a class of MCMC schemes called informed importance tempering (IIT), which combine importance sampling and informed local proposals. Spectral gap bounds for IIT estimators are obtained, which demonstrate the remarkable scalability of IIT samplers for unimodal target distributions. The theoretical insights acquired in this note provide guidance on the choice of informed proposals in model selection and the use of importance sampling in MCMC methods.
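The combination the abstract describes can be illustrated with a minimal sketch: an IIT-style sampler always accepts an informed local proposal (chosen with probability proportional to a balancing function h of the posterior ratio) and corrects the resulting bias with per-state importance weights. The target, neighborhood structure, and function names below are illustrative assumptions, not the paper's implementation.

```python
import math
import random

def iit_sample(log_pi, neighbors, x0, n_iter, h=math.sqrt, seed=0):
    """Sketch of informed importance tempering on a discrete space.

    Every proposal is accepted; each visited state x carries the
    importance weight 1 / Z_h(x), where Z_h(x) = sum_y h(pi(y)/pi(x))
    over the neighbors y of x.
    """
    rng = random.Random(seed)
    x = x0
    states, weights = [], []
    for _ in range(n_iter):
        nbrs = neighbors(x)
        # informed proposal weights h(pi(y)/pi(x)) for each neighbor y
        w = [h(math.exp(log_pi(y) - log_pi(x))) for y in nbrs]
        Z = sum(w)
        states.append(x)
        weights.append(1.0 / Z)  # importance weight corrects the bias
        # jump to a neighbor with probability proportional to its weight
        x = rng.choices(nbrs, weights=w)[0]
    return states, weights

# Usage: a unimodal target pi(x) ∝ exp(-(x - 10)^2 / 2) on {0, ..., 20}
log_pi = lambda x: -0.5 * (x - 10) ** 2
neighbors = lambda x: [y for y in (x - 1, x + 1) if 0 <= y <= 20]
states, weights = iit_sample(log_pi, neighbors, x0=0, n_iter=5000)

# Self-normalized IIT estimate of the posterior mean
est = sum(w * s for w, s in zip(weights, states)) / sum(weights)
```

Because no proposal is ever rejected, the chain never wastes iterations at a single state; for a unimodal target like the one above, the informed weights push the walk rapidly toward the mode, which is the scalability phenomenon the note analyzes.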


Related research

11/20/2017  Informed proposals for local MCMC in discrete spaces
There is a lack of methodological results to design efficient Markov cha...

06/26/2018  New Estimation Approaches for the Linear Ballistic Accumulator Model
The Linear Ballistic Accumulator (LBA) model of Brown (2008) is used as ...

10/17/2016  Black-box Importance Sampling
Importance sampling is widely used in machine learning and statistics, b...

03/11/2019  Embarrassingly parallel MCMC using deep invertible transformations
While MCMC methods have become a main work-horse for Bayesian inference,...

10/12/2022  Importance Sampling Methods for Bayesian Inference with Partitioned Data
This article presents new methodology for sample-based Bayesian inferenc...

10/28/2019  Multilevel Dimension-Independent Likelihood-Informed MCMC for Large-Scale Inverse Problems
We present a non-trivial integration of dimension-independent likelihood...

11/21/2019  Parallelising MCMC via Random Forests
For Bayesian computation in big data contexts, the divide-and-conquer MC...
