Various proofs of the Fundamental Theorem of Markov Chains

04/02/2022
by Somenath Biswas et al.

This paper is a survey of various proofs of the so-called fundamental theorem of Markov chains: every ergodic Markov chain has a unique positive stationary distribution, and the chain converges to this distribution in the limit, independent of the initial distribution it started with. As Markov chains are stochastic processes, it is natural to use probability-based arguments in the proofs. At the same time, the dynamics of a Markov chain is completely captured by its initial distribution, which is a vector, and its transition probability matrix; therefore, arguments based on matrix analysis and linear algebra can also be used. The proofs discussed below use one or the other of these two types of arguments, except in one case where the argument is graph theoretic. Appropriate credits for the various proofs are given in the main text. Our first proof is entirely elementary, yet quite simple. It also suggests a mixing time bound, which we prove, although in many cases this bound will not be the best possible. One approach to proving the fundamental theorem breaks the proof into two parts: (i) show the existence of a unique positive stationary distribution for irreducible Markov chains, and (ii) assuming that an ergodic chain has a stationary distribution, show that the chain converges to that distribution in the limit irrespective of the initial distribution. For (i), we survey two proofs, one using probabilistic arguments and the other graph-theoretic arguments. For (ii), we first give a coupling-based proof (coupling is a probability-based technique); the second proof uses matrix analysis. Finally, we give a proof of the fundamental theorem using only linear algebra concepts.
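To make the statement of the theorem concrete, here is a minimal numerical sketch, not taken from the paper: the 3-state transition matrix P below is a made-up ergodic example. The code computes the stationary distribution pi as the left eigenvector of P for eigenvalue 1 and then iterates two different initial distributions, showing that both converge to the same pi.

import numpy as np

# A made-up ergodic (irreducible, aperiodic) transition matrix; rows sum to 1.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.6, 0.2],
              [0.3, 0.3, 0.4]])

# Stationary distribution: left eigenvector of P for eigenvalue 1, i.e. pi = pi P.
eigvals, eigvecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1.0))
pi = np.real(eigvecs[:, idx])
pi = pi / pi.sum()

# Two different initial distributions converge to the same limit.
for mu in (np.array([1.0, 0.0, 0.0]), np.array([0.1, 0.1, 0.8])):
    for _ in range(100):
        mu = mu @ P          # one step of the chain: mu_{t+1} = mu_t P
    print(mu, pi)            # both printed vectors agree with pi up to rounding

Running this prints essentially the same vector for both starting points, matching pi up to rounding, which is exactly the convergence claim of the fundamental theorem.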


