A Hybrid Alternative to Gibbs Sampling for Bayesian Latent Variable Models

08/27/2018
by   Grant Backlund, et al.

Gibbs sampling is a widely popular Markov chain Monte Carlo algorithm which is often used to analyze intractable posterior distributions associated with Bayesian hierarchical models. The goal of this article is to introduce an alternative to Gibbs sampling that is particularly well suited for Bayesian models which contain latent or missing data. The basic idea of this hybrid algorithm is to update the latent data from its full conditional distribution at every iteration, and then use a random scan to update the parameters of interest. The hybrid algorithm is often easier to analyze from a theoretical standpoint than the deterministic or random scan Gibbs sampler. We highlight a positive result in this direction from Abrahamsen and Hobert (2018), who proved geometric ergodicity of the hybrid algorithm for a Bayesian version of the general linear mixed model with a continuous shrinkage prior. The convergence rate of the Gibbs sampler for this model remains unknown. In addition, we provide new geometric ergodicity results for the hybrid algorithm and the Gibbs sampler for two classes of Bayesian linear regression models with non-Gaussian errors. In both cases, the conditions under which the hybrid algorithm is geometric are much weaker than the corresponding conditions for the Gibbs sampler. Finally, we show that the hybrid algorithm is amenable to a modified version of the sandwich methodology of Hobert and Marchev (2008), which can be used to speed up the convergence rate of the underlying Markov chain while requiring roughly the same computational effort per iteration.
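To make the "update the latent data every iteration, then random-scan the parameters" idea concrete, here is a minimal sketch on a toy model in the spirit of the paper: linear regression with Student-t errors, represented as a Gaussian scale mixture with latent precisions z_i. The model, the flat priors on the coefficients, and all variable names are illustrative assumptions, not the paper's actual construction.

```python
import math
import random

def hybrid_sampler(y, x, nu=5.0, iters=2000, burn=500, seed=1):
    """Hybrid MCMC sketch for t-error regression y_i ~ t_nu(b0 + b1*x_i).

    Each iteration: (1) the latent scale variables z_i are all redrawn
    from their full conditional; (2) ONE regression coefficient, chosen
    uniformly at random, is updated from its conditional -- the random
    scan step. Flat priors on (b0, b1) are assumed for simplicity.
    """
    rng = random.Random(seed)
    n = len(y)
    b0, b1 = 0.0, 0.0
    draws = []
    for it in range(iters):
        # (1) latent data: z_i | rest ~ Gamma((nu+1)/2, rate=(nu + r_i^2)/2)
        z = []
        for i in range(n):
            r = y[i] - b0 - b1 * x[i]
            # random.gammavariate takes (shape, scale); scale = 1/rate
            z.append(rng.gammavariate((nu + 1) / 2.0, 2.0 / (nu + r * r)))
        # (2) random scan: update exactly one coefficient per iteration
        if rng.random() < 0.5:
            sw = sum(z)
            m = sum(zi * (yi - b1 * xi) for zi, yi, xi in zip(z, y, x)) / sw
            b0 = rng.gauss(m, 1.0 / math.sqrt(sw))
        else:
            sw = sum(zi * xi * xi for zi, xi in zip(z, x))
            m = sum(zi * xi * (yi - b0) for zi, yi, xi in zip(z, y, x)) / sw
            b1 = rng.gauss(m, 1.0 / math.sqrt(sw))
        if it >= burn:
            draws.append((b0, b1))
    return draws
```

A deterministic-scan Gibbs sampler would instead update b0 and b1 in a fixed order every iteration; the only change in the hybrid scheme is that the parameter block is touched one randomly chosen coordinate at a time, which is what makes its transition kernel easier to analyze theoretically.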


Related research:

- Geometric ergodicity of Gibbs samplers for the Horseshoe and its regularized variants (01/02/2021). The Horseshoe is a widely used and popular continuous shrinkage prior fo...
- Gibbs Sampling using Anti-correlation Gaussian Data Augmentation, with Applications to L1-ball-type Models (09/17/2023). L1-ball-type priors are a recent generalization of the spike-and-slab pr...
- Simulating Markov random fields with a conclique-based Gibbs sampler (08/14/2018). For spatial and network data, we consider models formed from a Markov ra...
- Improving Gibbs Sampler Scan Quality with DoGS (07/18/2017). The pairwise influence matrix of Dobrushin has long been used as an anal...
- Geometric ergodicity of Gibbs samplers for Bayesian error-in-variable regression (09/17/2022). We consider Bayesian error-in-variable (EIV) linear regression accountin...
- Adapting The Gibbs Sampler (01/28/2018). The popularity of Adaptive MCMC has been fueled on the one hand by its s...
- Bayesian Dynamic Fused LASSO (05/29/2019). The new class of Markov processes is proposed to realize the flexible sh...
