Complexity Results for MCMC derived from Quantitative Bounds

08/02/2017 ∙ by Jun Yang, et al.

This paper considers whether MCMC quantitative convergence bounds can be translated into complexity bounds. We prove that a certain realistic Gibbs sampler algorithm converges in a constant number of iterations. Our proof uses a new general method of establishing a generalized geometric drift condition defined on a subset of the state space. The subset is chosen to rule out certain "bad" states which have poor drift properties when the dimension gets large. Using the new general method, the quantitative bounds obtained for the Gibbs sampler algorithm can be translated into tight complexity bounds in the high-dimensional setting. It is our hope that the new general approach can be employed in many other specific examples to obtain complexity bounds for high-dimensional Markov chains.
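To make the drift-condition idea concrete, here is a minimal sketch, not the paper's actual algorithm: a toy two-component Gibbs sampler for a bivariate normal (an assumed illustrative target with correlation `RHO`), with a Monte Carlo check of a geometric drift condition E[V(X') | X = x] ≤ λ·V(x) + b at a few states inside a "good" subset of the state space. The function names, the Lyapunov function V, and the constants λ and b are all hypothetical choices for illustration.

```python
import random

# Toy illustration (NOT the paper's Gibbs sampler): deterministic-scan
# Gibbs sampler for a bivariate normal with correlation RHO. We check a
# geometric drift condition  E[V(X') | X = x] <= lam * V(x) + b  on a
# subset of states; restricting to such a subset (ruling out "bad"
# states with poor drift properties) is the idea being illustrated.

RHO = 0.5
SD = (1 - RHO ** 2) ** 0.5  # conditional standard deviation

def gibbs_sweep(x, y, rng):
    """One full sweep: draw x | y, then y | x (exact conditionals)."""
    x = rng.gauss(RHO * y, SD)
    y = rng.gauss(RHO * x, SD)
    return x, y

def V(x, y):
    """A simple Lyapunov (drift) function, bounded below by 1."""
    return 1.0 + x * x + y * y

def estimated_drift(x, y, n=20000, seed=0):
    """Monte Carlo estimate of E[V(X') | X = (x, y)] after one sweep."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        xp, yp = gibbs_sweep(x, y, rng)
        total += V(xp, yp)
    return total / n

# Check the drift inequality at a few states in an assumed "good" subset.
lam, b = 0.5, 3.0
for state in [(0.0, 0.0), (2.0, -3.0), (-5.0, 10.0)]:
    lhs = estimated_drift(*state)
    rhs = lam * V(*state) + b
    assert lhs <= rhs, (state, lhs, rhs)
print("drift condition holds at all tested states")
```

For this toy target the one-sweep expectation can be computed in closed form (E[V'] = 2.6875 + 0.3125·y² when RHO = 0.5), so the Monte Carlo check is easy to validate; in realistic high-dimensional samplers one instead proves such a bound analytically, and the quality of the resulting complexity bound hinges on choosing the subset so that λ and b do not degrade with dimension.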





