Asymptotically Stable Drift and Minorization for Markov Chains with Application to Albert and Chib's Algorithm

12/24/2017
by Qian Qin, et al.

The use of MCMC algorithms in high-dimensional Bayesian problems has become routine. This has spurred so-called convergence complexity analysis, the goal of which is to ascertain how the convergence rate of a Monte Carlo Markov chain scales with sample size, n, and/or number of covariates, p. Recent work suggests that, while often useful for establishing convergence rates when n and p are fixed, techniques based on drift and minorization may not be versatile enough to handle the more delicate task of convergence complexity analysis. This article provides some general ideas for sharpening drift and minorization conditions, particularly in cases where n and/or p are large. The key ideas include developing an appropriately "centered" drift function, and suppressing high-dimensionality in the construction of minorization conditions. These concepts are employed in a thorough convergence complexity analysis of Albert and Chib's (1993) data augmentation algorithm for the Bayesian probit model. The main result is that the geometric convergence rate of the underlying Markov chain is bounded below 1 both as n → ∞ (with p fixed), and as p → ∞ (with n fixed). Furthermore, the first computable bounds on the total variation distance to stationarity are byproducts of the asymptotic analysis.
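For readers unfamiliar with the Markov chain analyzed in the paper, the following is a minimal sketch (not the authors' code) of Albert and Chib's (1993) two-block data augmentation sampler for the Bayesian probit model, written here under the simplifying assumption of a flat prior on the coefficient vector; the function and variable names (X, y, n_iter) are illustrative.

```python
# Sketch of the Albert & Chib (1993) data augmentation chain for probit
# regression, assuming a flat prior on beta and a full-rank design matrix X.
import numpy as np
from scipy.stats import truncnorm

def albert_chib_sampler(X, y, n_iter=1000, rng=None):
    """Alternate the two DA steps: draw z | beta, y, then beta | z."""
    rng = np.random.default_rng() if rng is None else rng
    n, p = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)        # (X'X)^{-1}, assumed to exist
    chol = np.linalg.cholesky(XtX_inv)
    beta = np.zeros(p)
    draws = np.empty((n_iter, p))
    for t in range(n_iter):
        # Step 1: latent z_i ~ N(x_i' beta, 1), truncated to (0, inf) when
        # y_i = 1 and to (-inf, 0] when y_i = 0 (bounds are standardized).
        mu = X @ beta
        lower = np.where(y == 1, -mu, -np.inf)
        upper = np.where(y == 1, np.inf, -mu)
        z = mu + truncnorm.rvs(lower, upper, size=n, random_state=rng)
        # Step 2: beta | z ~ N((X'X)^{-1} X'z, (X'X)^{-1}) under the flat prior.
        beta_mean = XtX_inv @ (X.T @ z)
        beta = beta_mean + chol @ rng.standard_normal(p)
        draws[t] = beta
    return draws
```

The marginal beta-chain produced by this alternation is the object whose geometric convergence rate the paper bounds away from 1, via a drift function centered appropriately in n and p together with a minorization argument that avoids the usual degradation in high dimensions.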
