Unbiased Multilevel Monte Carlo methods for intractable distributions: MLMC meets MCMC
Constructing unbiased estimators from Markov chain Monte Carlo (MCMC) output has recently attracted much attention in the statistics and machine learning communities. However, the existing unbiased MCMC framework only works when the quantity of interest is an expectation, which rules out many practical applications. In this paper, we propose a general method to construct unbiased estimators for functions of expectations, and we further generalize it to estimate nested expectations. Our approach combines and generalizes the unbiased MCMC and Multilevel Monte Carlo (MLMC) methods. In contrast to traditional sequential methods, our estimator can be implemented on parallel processors with fully independent replications. We prove that, under mild conditions, our estimator has finite variance and finite expected computational cost, and achieves ε-accuracy within O(1/ε^2) computational cost. We also illustrate our estimator on both synthetic and real data examples.
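To make the MLMC debiasing idea concrete, the following is a minimal sketch of a randomized single-term multilevel estimator for a function of an expectation, f(E[X]). It is not the paper's algorithm: it uses i.i.d. draws from a toy Gaussian target in place of coupled MCMC output, and the function f, the target, and the level-probability decay rate are illustrative assumptions. Level l averages 2^l draws; an antithetic fine/coarse split makes the level corrections telescope so that averaging independent replications removes the plug-in bias of f applied to a sample mean.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(m):
    # Hypothetical smooth function of the expectation; here f(x) = x^2.
    return m ** 2

def level_difference(level):
    # Antithetic MLMC correction: Delta_l = f(fine mean) minus the average of
    # f over the two coarse half-means. Level l uses 2^l i.i.d. draws (the
    # paper would use coupled MCMC chains here). At level 0 the "correction"
    # is just f of a single draw.
    n = 2 ** level
    x = rng.normal(loc=1.0, scale=1.0, size=n)  # toy target: X ~ N(1, 1), so f(E[X]) = 1
    if level == 0:
        return f(x.mean())
    half = n // 2
    fine = f(x.mean())
    coarse = 0.5 * (f(x[:half].mean()) + f(x[half:].mean()))
    return fine - coarse

def single_term_estimate(max_level=20, r=2 ** -1.5):
    # One replication Z = Delta_L / P(L = l), with L drawn from a (truncated,
    # renormalized) geometric law. The decay rate r balances variance and cost.
    probs = (1 - r) * r ** np.arange(max_level)
    probs /= probs.sum()
    L = rng.choice(max_level, p=probs)
    return level_difference(L) / probs[L]

# Averaging independent replications recovers f(E[X]) = 1 (up to the tiny
# truncation bias from capping the level); replications can run in parallel.
estimates = [single_term_estimate() for _ in range(100_000)]
print(np.mean(estimates))
```

The geometric decay 2^(-1.5l) is chosen so that, since the antithetic correction variance shrinks like O(2^(-2l)) while the cost grows like O(2^l), both the variance and the expected cost of one replication are finite, which is the regime the abstract's O(1/ε^2) complexity claim refers to.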