Quantitative convergence rates for reversible Markov chains via strong random times

08/18/2019
by Daniel C. Jerison, et al.

Let (X_t) be a discrete time Markov chain on a general state space. It is well-known that if (X_t) is aperiodic and satisfies a drift and minorization condition, then it converges to its stationary distribution π at an exponential rate. We consider the problem of computing upper bounds for the distance from stationarity in terms of the drift and minorization data. Baxendale showed that these bounds improve significantly if one assumes that (X_t) is reversible with nonnegative eigenvalues (i.e. its transition kernel is a self-adjoint operator on L^2(π) with spectrum contained in [0,1]). We identify this phenomenon as a special case of a general principle: for a reversible chain with nonnegative eigenvalues, any strong random time gives direct control over the convergence rate. We formulate this principle precisely and deduce from it a stronger version of Baxendale's result. Our approach is fully quantitative and allows us to convert drift and minorization data into explicit convergence bounds. We show that these bounds are tighter than those of Rosenthal and Baxendale when applied to a well-studied example.
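
To make the abstract concrete, the sketch below illustrates the kind of geometric control that reversibility with nonnegative eigenvalues provides. It uses a hypothetical example chain (a lazy random walk on a 20-state cycle, not an example from the paper) and compares the exact total variation distance to stationarity with the standard L^2 spectral bound ||P^t(x,.) - π||_TV ≤ (1/2) sqrt((1-π(x))/π(x)) λ_2^t; the drift-and-minorization bounds developed in the paper are not reproduced here.

import numpy as np

# Hypothetical example chain (not from the paper): lazy random walk on a cycle.
# The laziness makes the chain reversible with nonnegative eigenvalues.
n = 20
P = np.zeros((n, n))
for i in range(n):
    P[i, i] = 0.5
    P[i, (i + 1) % n] += 0.25
    P[i, (i - 1) % n] += 0.25

pi = np.full(n, 1.0 / n)  # uniform stationary distribution

# P is symmetric, hence reversible with respect to pi; eigvalsh returns
# real eigenvalues in ascending order, so [-2] is the second-largest one.
lam2 = np.linalg.eigvalsh(P)[-2]

Pt = np.eye(n)
for t in range(1, 401):
    Pt = Pt @ P
    # Exact total variation distance from stationarity, started at state 0.
    tv = 0.5 * np.abs(Pt[0] - pi).sum()
    # Standard L^2 spectral bound for a reversible chain with nonnegative spectrum.
    bound = 0.5 * np.sqrt((1 - pi[0]) / pi[0]) * lam2 ** t
    if t % 100 == 0:
        print(f"t={t:3d}  exact TV={tv:.3e}  spectral bound={bound:.3e}")

On this toy chain both quantities decay geometrically at rate λ_2 ≈ 0.975, and the spectral bound tracks the true distance up to an explicit constant. The point of the paper is to obtain comparable explicit geometric bounds when only drift and minorization data, rather than the spectrum itself, are available.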

Related research

10/14/2019
Drift, Minorization, and Hitting Times
The “drift-and-minorization” method, introduced and popularized in (Rose...

03/21/2020
On the limitations of single-step drift and minorization in Markov chain convergence analysis
Over the last three decades, there has been a considerable effort within...

12/09/2013
A Unified Markov Chain Approach to Analysing Randomised Search Heuristics
The convergence, convergence rate and expected hitting time play fundame...

12/24/2017
Asymptotically Stable Drift and Minorization for Markov Chains with Application to Albert and Chib's Algorithm
The use of MCMC algorithms in high dimensional Bayesian problems has bec...

12/14/2019
Mixing Time Estimation in Ergodic Markov Chains from a Single Trajectory with Contraction Methods
The mixing time t_mix of an ergodic Markov chain measures the rate of co...

04/22/2019
Convergence of diffusions and their discretizations: from continuous to discrete processes and back
In this paper, we establish new quantitative convergence bounds for a cl...

10/27/2021
The ODE Method for Asymptotic Statistics in Stochastic Approximation and Reinforcement Learning
The paper concerns convergence and asymptotic statistics for stochastic ...