Auxiliary gradient-based sampling algorithms

10/30/2016
by Michalis K. Titsias et al.

We introduce a new family of MCMC samplers that combine auxiliary variables, Gibbs sampling and Taylor expansions of the target density. Our approach permits either marginalising over the auxiliary variables, yielding marginal samplers, or augmenting the state with them, yielding auxiliary samplers. The well-known Metropolis-adjusted Langevin algorithm (MALA) and the preconditioned Crank-Nicolson Langevin (pCNL) algorithm are shown to be special cases. We prove that marginal samplers are superior in terms of asymptotic variance, but demonstrate cases where they are slower in computing time than auxiliary samplers. In the context of latent Gaussian models, we propose new auxiliary and marginal samplers whose implementation requires a single tuning parameter, which can be found automatically during the transient phase. Extensive experimentation shows that the increase in efficiency (measured as effective sample size per unit of computing time) relative to (optimised implementations of) pCNL, elliptical slice sampling and MALA ranges from 10-fold in binary classification problems, to 25-fold in log-Gaussian Cox processes, to 100-fold in Gaussian process regression, and it is on par with Riemann manifold Hamiltonian Monte Carlo in an example where the latter has the same complexity as the aforementioned algorithms. We explain this remarkable improvement in terms of the way alternative samplers try to approximate the eigenvalues of the target. We introduce a novel MCMC sampling scheme for hyperparameter learning that builds upon the auxiliary samplers. The MATLAB code for reproducing the experiments in the article is publicly available, and a Supplement to this article contains additional experiments and implementation details.
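The abstract names MALA as a special case of the proposed sampler family. As a point of reference, here is a minimal generic sketch of MALA in Python/NumPy; it is not the authors' MATLAB implementation, and the function names and the step-size parameter delta are ours for illustration.

```python
import numpy as np

def mala(log_target, grad_log_target, x0, delta, n_iters, rng=None):
    """Metropolis-adjusted Langevin algorithm (generic sketch).

    Proposal: y = x + (delta / 2) * grad_log_target(x) + sqrt(delta) * xi,
    with xi ~ N(0, I), accepted via the Metropolis-Hastings ratio.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    lp, g = log_target(x), grad_log_target(x)
    samples, accepted = [], 0
    for _ in range(n_iters):
        mean_fwd = x + 0.5 * delta * g
        y = mean_fwd + np.sqrt(delta) * rng.standard_normal(x.shape)
        lpy, gy = log_target(y), grad_log_target(y)
        mean_rev = y + 0.5 * delta * gy
        # log q(x|y) - log q(y|x); the Gaussian normalising constants cancel
        log_q_ratio = (np.sum((y - mean_fwd) ** 2)
                       - np.sum((x - mean_rev) ** 2)) / (2.0 * delta)
        if np.log(rng.uniform()) < lpy - lp + log_q_ratio:
            x, lp, g = y, lpy, gy
            accepted += 1
        samples.append(x.copy())
    return np.asarray(samples), accepted / n_iters

# Example: a 10-dimensional standard Gaussian target
samples, acc = mala(log_target=lambda x: -0.5 * np.sum(x ** 2),
                    grad_log_target=lambda x: -x,
                    x0=np.zeros(10), delta=0.5, n_iters=5000)
```

In practice delta is tuned so that the empirical acceptance rate sits near the 0.574 level suggested by optimal-scaling theory for MALA; the single tuning parameter of the samplers proposed in the article plays an analogous role.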


Related research

03/01/2023 · Auxiliary MCMC and particle Gibbs samplers for parallelisable inference in latent dynamical systems
We introduce two new classes of exact Markov chain Monte Carlo (MCMC) sa...

07/16/2014 · A marginal sampler for σ-Stable Poisson-Kingman mixture models
We investigate the class of σ-stable Poisson-Kingman random probability...

06/04/2013 · Fast Gradient-Based Inference with Continuous Latent Variable Models in Auxiliary Form
We propose a technique for increasing the efficiency of gradient-based i...

05/17/2020 · Hamiltonian Assisted Metropolis Sampling
Various Markov chain Monte Carlo (MCMC) methods are studied to improve u...

07/29/2022 · Enhanced gradient-based MCMC in discrete spaces
The recent introduction of gradient-based MCMC for discrete spaces holds...

11/17/2015 · Accelerating pseudo-marginal Metropolis-Hastings by correlating auxiliary variables
Pseudo-marginal Metropolis-Hastings (pmMH) is a powerful method for Baye...

07/09/2021 · Fast compression of MCMC output
We propose cube thinning, a novel method for compressing the output of a...
