Informed proposals for local MCMC in discrete spaces

11/20/2017
by Giacomo Zanella

There is a lack of methodological results for designing efficient Markov chain Monte Carlo (MCMC) algorithms for statistical models with discrete-valued high-dimensional parameters. Motivated by this consideration, we propose a simple framework for the design of informed MCMC proposals (i.e. Metropolis-Hastings proposal distributions that appropriately incorporate local information about the target) which is naturally applicable to both discrete and continuous spaces. We explicitly characterize the class of optimal proposal distributions under this framework, which we refer to as locally-balanced proposals, and prove their Peskun-optimality in high-dimensional regimes. The resulting algorithms are straightforward to implement in discrete spaces and provide orders-of-magnitude improvements in efficiency compared to alternative MCMC schemes, including discrete versions of Hamiltonian Monte Carlo. Experiments are performed on both simulated and real datasets, including a detailed application to Bayesian record linkage. A direct connection with gradient-based MCMC suggests that locally-balanced proposals may be seen as a natural way to extend the latter to discrete spaces.
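To make the construction concrete, the sketch below (a Python illustration, not code from the paper) shows one Metropolis-Hastings step driven by a locally-balanced proposal on a binary state vector, assuming single-bit-flip neighbourhoods and the balancing function g(t) = √t; the names `locally_balanced_step` and `log_pi` are illustrative assumptions rather than anything defined in the source.

```python
import numpy as np

def locally_balanced_step(x, log_pi, rng, g=np.sqrt):
    """One MH step with a locally-balanced, single-bit-flip proposal.

    Minimal illustrative sketch: x is a binary numpy vector, neighbours
    differ by one flipped bit, and neighbour y is proposed with
    probability proportional to g(pi(y)/pi(x)).
    """
    d = len(x)
    logp_x = log_pi(x)

    def neighbour_weights(z, logp_z):
        # Weight each bit flip by g(pi(y)/pi(z)), computed via log-density ratios.
        w = np.empty(d)
        for i in range(d):
            y = z.copy()
            y[i] = 1 - y[i]
            w[i] = g(np.exp(log_pi(y) - logp_z))
        return w

    w_x = neighbour_weights(x, logp_x)
    probs_x = w_x / w_x.sum()

    # Propose flipping bit i with probability proportional to its weight.
    i = rng.choice(d, p=probs_x)
    y = x.copy()
    y[i] = 1 - y[i]
    logp_y = log_pi(y)

    # Probability of proposing the reverse move, needed for the MH correction.
    w_y = neighbour_weights(y, logp_y)
    prob_rev = w_y[i] / w_y.sum()

    log_alpha = (logp_y - logp_x) + np.log(prob_rev) - np.log(probs_x[i])
    if np.log(rng.uniform()) < min(0.0, log_alpha):
        return y, logp_y
    return x, logp_x
```

For example, with `rng = np.random.default_rng(0)` and a `log_pi` returning the log target density of a binary vector, repeatedly calling `locally_balanced_step` yields a chain whose proposals concentrate on the higher-probability neighbours, which is the sense in which the proposal is "informed" by local information about the target.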
