Reversible Markov chains: variational representations and ordering

09/06/2018
by Chris Sherlock et al.

This pedagogical document explains three variational representations that are useful when comparing the efficiencies of reversible Markov chains: (i) the Dirichlet form and the associated variational representations of the spectral gaps; (ii) a variational representation of the asymptotic variance of an ergodic average; and (iii) the conductance, and the equivalence of a non-zero conductance to a non-zero right spectral gap.
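
As a quick reference, the standard finite-state-space forms of these three objects are sketched below; this is generic notation assumed here for illustration, not necessarily the setting or notation used in the paper itself. For a transition kernel P that is reversible with respect to a probability distribution \pi, write \langle f,g\rangle_\pi = \sum_x \pi(x) f(x) g(x). Then

\[
\mathcal{E}(f,f) \;=\; \langle f,(I-P)f\rangle_\pi \;=\; \tfrac{1}{2}\sum_{x,y}\pi(x)P(x,y)\,\big(f(x)-f(y)\big)^2 \qquad \text{(Dirichlet form)},
\]
\[
\gamma \;=\; \inf\Big\{ \tfrac{\mathcal{E}(f,f)}{\operatorname{Var}_\pi(f)} \,:\, \operatorname{Var}_\pi(f) > 0 \Big\} \qquad \text{(right spectral gap)},
\]
\[
v(f,P) \;=\; 2\sup_{g}\big\{ 2\langle f,g\rangle_\pi - \mathcal{E}(g,g) \big\} - \langle f,f\rangle_\pi \qquad \text{(asymptotic variance, for $\pi$-mean-zero $f$ when $\gamma>0$)},
\]
\[
\Phi \;=\; \min_{A \,:\, 0 < \pi(A) \le 1/2} \frac{\sum_{x\in A}\sum_{y\notin A}\pi(x)P(x,y)}{\pi(A)} \qquad \text{(conductance)}.
\]

The Cheeger-type bounds \Phi^2/2 \le \gamma \le 2\Phi then make the equivalence in (iii), a non-zero conductance if and only if a non-zero right spectral gap, concrete in this setting.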


Related research

- Efficient shape-constrained inference for the autocovariance sequence from a reversible Markov chain (07/26/2022)
- Geometric ergodicity of trans-dimensional Markov chain Monte Carlo algorithms (07/31/2023)
- Optimal prediction of Markov chains with and without spectral gap (06/26/2021)
- Information Geometry of Reversible Markov Chains (06/10/2021)
- Estimating the Mixing Time of Ergodic Markov Chains (02/01/2019)
- A Unified Approach to Variational Autoencoders and Stochastic Normalizing Flows via Markov Chains (11/24/2021)
- A reversible infinite HMM using normalised random measures (03/17/2014)
