
Metastable Mixing of Markov Chains: Efficiently Sampling Low Temperature Exponential Random Graphs

by Guy Bresler et al.

In this paper we consider the problem of sampling from the low-temperature exponential random graph model (ERGM). The usual approach is via Markov chain Monte Carlo, but Bhamidi et al. showed that any local Markov chain suffers from an exponentially large mixing time due to metastable states. We instead consider metastable mixing, a notion of approximate mixing relative to the stationary distribution, for which it suffices to mix only within a collection of metastable states. We show that, except on a lower-dimensional critical set of parameters, the Glauber dynamics for the ERGM at any temperature, when initialized at G(n, p) for the right choice of p, has a metastable mixing time of O(n^2 log n) to within total variation distance exp(-Ω(n)).
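To make the object of study concrete, here is a minimal sketch of Glauber dynamics for a simple edge–triangle ERGM, initialized at G(n, p) as the abstract prescribes. The specific Hamiltonian (edge and triangle counts with weights `beta_edge`, `beta_tri`) and all function names are illustrative assumptions, not taken from the paper; each step resamples a single edge slot from its conditional distribution given the rest of the graph.

```python
import itertools
import math
import random

def common_neighbors(adj, i, j):
    # Number of triangles the edge (i, j) would close.
    return len(adj[i] & adj[j])

def glauber_step(adj, n, beta_edge, beta_tri, rng):
    # Pick a uniformly random vertex pair and resample that edge
    # conditionally on the rest of the graph.
    i, j = rng.sample(range(n), 2)
    # Change in the (assumed) Hamiltonian from including edge (i, j).
    delta = beta_edge + beta_tri * common_neighbors(adj, i, j)
    p_on = 1.0 / (1.0 + math.exp(-delta))
    if rng.random() < p_on:
        adj[i].add(j)
        adj[j].add(i)
    else:
        adj[i].discard(j)
        adj[j].discard(i)

def sample_ergm(n, beta_edge, beta_tri, p_init, steps, seed=0):
    rng = random.Random(seed)
    # Initialize at an Erdos-Renyi graph G(n, p_init).
    adj = {v: set() for v in range(n)}
    for i, j in itertools.combinations(range(n), 2):
        if rng.random() < p_init:
            adj[i].add(j)
            adj[j].add(i)
    # Run the single-edge Glauber updates; the paper's metastable
    # mixing bound concerns chains of this local-update type.
    for _ in range(steps):
        glauber_step(adj, n, beta_edge, beta_tri, rng)
    return adj
```

With both weights set to zero, each update keeps the edge with probability 1/2, so the chain's stationary law is exactly G(n, 1/2); nonzero `beta_tri` introduces the triangle interaction responsible for the metastability the paper addresses.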

