
A step further towards automatic and efficient reversible jump algorithms

11/05/2019
by Philippe Gagnon, et al.

Incorporating information about the target distribution in proposal mechanisms generally increases the efficiency of Markov chain Monte Carlo algorithms compared with those based on naive random walks. Hamiltonian Monte Carlo is a successful example of a fixed-dimensional algorithm incorporating gradient information. In trans-dimensional algorithms, Green (2003) recommended generating the parameter proposals during model switches from normal distributions with informative means and covariance matrices. These proposal distributions can be viewed as approximations to the limiting parameter distributions, where the limit is with respect to the sample size. Models, however, are typically proposed naively. In this paper, we build on the approach of Zanella (2019) for discrete spaces to incorporate information about neighbouring models. More specifically, we rely on approximations to posterior model probabilities that are asymptotically exact as the sample size increases. We prove that, as expected, samplers combining this approach with that of Green (2003) behave, in the large-sample regime, like samplers able to generate from both the model distribution and the parameter distributions. We also prove that the proposed strategy is optimal when the posterior model probabilities concentrate. We review generic methods that improve parameter proposals when the sample size is not large enough, and we show how these methods can be leveraged to improve model proposals as well. The methodology is applied to a real-data example. Detailed guidelines for fully automating the implementation of the methodology are provided, and the code is available online.
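To make the combination concrete, below is a minimal Python sketch (not the authors' implementation) of one reversible jump model switch that pairs an informed, locally balanced model proposal in the spirit of Zanella (2019) with parameter proposals drawn from normal approximations as in Green (2003). The functions log_post_model_approx, neighbours, fit_normal_approx and log_post are hypothetical placeholders for problem-specific components; the balancing function g(t) = sqrt(t) is just one common choice.

```python
# Minimal sketch, assuming hypothetical problem-specific components:
#   log_post_model_approx(k): asymptotically exact approximation to log posterior model prob.
#   neighbours(k):            list of models adjacent to k (assumed symmetric: k in neighbours(k')).
#   fit_normal_approx(k):     (mean, Cholesky factor) of a normal approximation within model k.
#   log_post(k, theta):       log unnormalised joint posterior density.
import numpy as np

rng = np.random.default_rng(0)

def g(t):
    # Balancing function for locally balanced (informed) proposals; sqrt is one common choice.
    return np.sqrt(t)

def informed_model_proposal(k, log_post_model_approx, neighbours):
    """Propose a neighbouring model with probability proportional to
    g(pi_hat(k') / pi_hat(k)), pi_hat being the approximate posterior model probability."""
    ks = neighbours(k)
    log_ratios = np.array([log_post_model_approx(kp) - log_post_model_approx(k) for kp in ks])
    weights = g(np.exp(log_ratios))
    probs = weights / weights.sum()
    idx = rng.choice(len(ks), p=probs)
    return ks[idx], probs[idx]

def rj_model_switch(k, theta, log_post, log_post_model_approx, neighbours, fit_normal_approx):
    """One reversible jump model switch: informed model proposal plus a parameter
    proposal drawn from a normal approximation to the within-model posterior."""
    # Forward model proposal and its probability q(k -> k').
    k_new, q_fwd = informed_model_proposal(k, log_post_model_approx, neighbours)

    # Reverse model-proposal probability q(k' -> k), needed in the acceptance ratio.
    ks_rev = neighbours(k_new)
    w_rev = g(np.exp(np.array([log_post_model_approx(kp) - log_post_model_approx(k_new)
                               for kp in ks_rev])))
    q_rev = w_rev[ks_rev.index(k)] / w_rev.sum()

    # Parameter proposal: draw the full parameter vector of the proposed model from
    # its normal approximation (no Jacobian term is needed with this independent draw).
    mu_new, chol_new = fit_normal_approx(k_new)
    theta_new = mu_new + chol_new @ rng.standard_normal(mu_new.size)
    mu_cur, chol_cur = fit_normal_approx(k)

    def log_norm(x, mu, chol):
        # Log density of N(mu, chol @ chol.T) evaluated at x.
        z = np.linalg.solve(chol, x - mu)
        return -0.5 * z @ z - np.log(np.diag(chol)).sum() - 0.5 * x.size * np.log(2 * np.pi)

    log_alpha = (log_post(k_new, theta_new) - log_post(k, theta)
                 + np.log(q_rev) - np.log(q_fwd)
                 + log_norm(theta, mu_cur, chol_cur)
                 - log_norm(theta_new, mu_new, chol_new))
    if np.log(rng.uniform()) < log_alpha:
        return k_new, theta_new
    return k, theta
```

In this sketch, the approximate posterior model probabilities play the role that the exact ones would play in an ideal sampler; as the sample size grows and both the model-probability approximations and the normal parameter approximations become exact, the combined sampler mimics one able to generate directly from the model and parameter distributions, which is the sense of the asymptotic result stated in the abstract.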

Related research

Reversible Genetically Modified Mode Jumping MCMC (10/11/2021)
In this paper, we introduce a reversible version of a genetically modifi...

Informed proposals for local MCMC in discrete spaces (11/20/2017)
There is a lack of methodological results to design efficient Markov cha...

Haar-Weave-Metropolis kernel (11/11/2021)
Recently, many Markov chain Monte Carlo methods have been developed with...

Online, Informative MCMC Thinning with Kernelized Stein Discrepancy (01/18/2022)
A fundamental challenge in Bayesian inference is efficient representatio...

Efficient Informed Proposals for Discrete Distributions via Newton's Series Approximation (02/27/2023)
Gradients have been exploited in proposal distributions to accelerate th...

Delayed rejection Hamiltonian Monte Carlo for sampling multiscale distributions (10/01/2021)
The efficiency of Hamiltonian Monte Carlo (HMC) can suffer when sampling...

Flexible sampling of discrete data correlations without the marginal distributions (06/12/2013)
Learning the joint dependence of discrete variables is a fundamental pro...