Variance reduction for Markov chains with application to MCMC

10/08/2019
by D. Belomestny et al.

In this paper we propose a novel variance reduction approach for additive functionals of Markov chains based on minimization of an estimate of the asymptotic variance of these functionals over suitable classes of control variates. A distinctive feature of the proposed approach is its ability to significantly reduce the overall finite-sample variance. This feature is demonstrated theoretically, by means of a deep non-asymptotic analysis of the variance-reduced functional, as well as empirically through a thorough simulation study. In particular, we apply our method to various MCMC Bayesian estimation problems, where it compares favourably to existing variance reduction approaches.
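The core idea, choosing control-variate coefficients by minimizing an empirical estimate of the asymptotic variance rather than the plain sample variance, can be illustrated with a short, self-contained sketch. Everything below is an assumption made for illustration and is not the paper's algorithm: a toy random-walk Metropolis chain targeting N(0, 1), Stein-type basis functions 1 - x^2 and 2x - x^3 (which have zero mean under that target), and a batch-means estimator of the asymptotic variance, whose minimization over the coefficients reduces to a least-squares fit on centred batch means.

```python
# Minimal sketch (not the authors' exact algorithm): variance reduction for an
# additive functional of an MCMC chain by choosing control-variate coefficients
# that minimize a batch-means estimate of the asymptotic variance.
# The target, the chain, and the basis functions `phi` are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def mcmc_chain(n, step=1.0):
    """Toy random-walk Metropolis chain targeting N(0, 1) (assumption)."""
    x, out = 0.0, np.empty(n)
    for i in range(n):
        prop = x + step * rng.normal()
        if np.log(rng.uniform()) < 0.5 * (x**2 - prop**2):
            x = prop
        out[i] = x
    return out

def batch_means(v, n_batches):
    """Means of consecutive batches, used in the batch-means variance estimate."""
    b = len(v) // n_batches
    return v[: b * n_batches].reshape(n_batches, b).mean(axis=1)

def fit_control_variates(f_vals, phi_vals, n_batches=50):
    """Coefficients theta minimizing the batch-means asymptotic-variance estimate
    of f - theta @ phi; this reduces to least squares on centred batch means."""
    F = batch_means(f_vals, n_batches)
    Phi = np.column_stack([batch_means(p, n_batches) for p in phi_vals])
    Fc, Phic = F - F.mean(), Phi - Phi.mean(axis=0)
    theta, *_ = np.linalg.lstsq(Phic, Fc, rcond=None)
    return theta

# Estimate E[X^2] under N(0, 1) (true value 1).
X = mcmc_chain(100_000)
f = X**2
# Stein-type control variates for N(0, 1): g'(x) - x * g(x) has zero mean;
# with g(x) = x and g(x) = x^2 this gives 1 - x^2 and 2x - x^3 (assumed basis).
phi = [1.0 - X**2, 2.0 * X - X**3]

theta = fit_control_variates(f, phi)
plain = f.mean()
reduced = (f - sum(t * p for t, p in zip(theta, phi))).mean()
print(f"plain ergodic average     : {plain:.4f}")
print(f"variance-reduced estimate : {reduced:.4f}")
```

In this toy setting the fitted coefficients drive the batch-means variance estimate of the corrected functional close to zero, so the variance-reduced estimate of E[X^2] = 1 is markedly more stable than the plain ergodic average.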


Related research

Theoretical guarantees for neural control variates in MCMC (04/03/2023)
In this paper, we propose a variance reduction approach for Markov chain...

Variance reduction for dependent sequences with applications to Stochastic Gradient MCMC (08/16/2020)
In this paper we propose a novel and practical variance reduction approa...

Variance reduction for MCMC methods via martingale representations (03/18/2019)
In this paper we propose an efficient variance reduction approach for MC...

Control variates and Rao-Blackwellization for deterministic sweep Markov chains (12/14/2019)
We study control variate methods for Markov chain Monte Carlo (MCMC) in ...

Variance reduction via empirical variance minimization: convergence and complexity (12/13/2017)
In this paper we propose and study a generic variance reduction approach...

Semi-Exact Control Functionals From Sard's Method (01/31/2020)
This paper focuses on the numerical computation of posterior expected qu...

Reanalysis of Variance Reduced Temporal Difference Learning (01/07/2020)
Temporal difference (TD) learning is a popular algorithm for policy eval...
