MCMC Confidence Intervals and Biases

12/04/2020 ∙ by Yu Hang Jiang, et al.

The recent paper "Simple confidence intervals for MCMC without CLTs" by J.S. Rosenthal derived a simple MCMC confidence interval that does not require a CLT, using only Chebyshev's inequality. That result required certain assumptions about how the estimator's bias and variance grow with the number of iterations n, in particular that the bias is o(1/√n). This assumption seemed mild, since the estimator bias is generally believed to be O(1/n) and hence o(1/√n). However, questions were raised about how to verify this assumption, and indeed we show that it might not always hold. Therefore, in this paper, we seek to simplify and weaken the assumptions of the previously mentioned paper, to make MCMC confidence intervals without CLTs more widely applicable.
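To illustrate the idea, here is a minimal sketch of a Chebyshev-based confidence interval for a Monte Carlo mean estimate. It assumes only a finite variance, not asymptotic normality: by Chebyshev's inequality, P(|x̄ − μ| ≥ ε) ≤ Var(x̄)/ε², which gives a conservative (1 − α) interval of half-width √(v/(αn)). This is a toy example with i.i.d. draws standing in for MCMC output; a real chain would require a variance estimate that accounts for autocorrelation (e.g. batch means), and the function name below is purely illustrative.

```python
import math
import random


def chebyshev_ci(samples, alpha=0.05):
    """Conservative (1 - alpha) confidence interval for the mean via
    Chebyshev's inequality; no CLT/normality assumption is used."""
    n = len(samples)
    mean = sum(samples) / n
    # Sample variance of one draw; Var(mean) is then var / n.
    var = sum((x - mean) ** 2 for x in samples) / (n - 1)
    # Chebyshev: P(|mean - mu| >= eps) <= (var / n) / eps**2 = alpha
    half = math.sqrt(var / (alpha * n))
    return mean - half, mean + half


# Toy stand-in for MCMC output: i.i.d. Gaussian draws with true mean 1.0.
random.seed(0)
draws = [random.gauss(1.0, 2.0) for _ in range(10_000)]
lo, hi = chebyshev_ci(draws)
```

Note that the Chebyshev interval is wider than the usual CLT-based one (√(1/α) ≈ 4.47 "standard errors" at α = 0.05, versus 1.96), which is the price paid for dropping the normality assumption.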


