Sampling Approximately Low-Rank Ising Models: MCMC meets Variational Methods

02/17/2022
by Frederic Koehler et al.

We consider Ising models on the hypercube with a general interaction matrix J, and give a polynomial time sampling algorithm when all but O(1) eigenvalues of J lie in an interval of length one, a situation which occurs in many models of interest. This was previously known for the Glauber dynamics when *all* eigenvalues fit in an interval of length one; however, a single outlier can force the Glauber dynamics to mix torpidly. Our general result implies the first polynomial time sampling algorithms for low-rank Ising models such as Hopfield networks with a fixed number of patterns and Bayesian clustering models with low-dimensional contexts, and greatly improves the polynomial time sampling regime for the antiferromagnetic/ferromagnetic Ising model with inconsistent field on expander graphs. It also improves on previous approximation algorithm results based on the naive mean-field approximation in variational methods and statistical physics. Our approach is based on a new fusion of ideas from the MCMC and variational inference worlds. As part of our algorithm, we define a new nonconvex variational problem which allows us to sample from an exponential reweighting of a distribution by a negative definite quadratic form, and show how to make this procedure provably efficient using stochastic gradient descent. On top of this, we construct a new simulated tempering chain (on an extended state space arising from the Hubbard-Stratonovich transform) which overcomes the obstacle posed by large positive eigenvalues, and combine it with the SGD-based sampler to solve the full problem.
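As a rough illustration of the objects named above (our own notation, not taken verbatim from the paper): the Ising model in question is the distribution

    p_J(x) \propto \exp\!\big( \tfrac{1}{2}\, x^\top J x + h^\top x \big), \qquad x \in \{-1,+1\}^n,

for an interaction matrix J and external field h, and the Hubbard-Stratonovich transform underlying the extended state space is the Gaussian identity

    \exp\!\big( \tfrac{1}{2}\, x^\top M x \big)
      \;=\; \big( (2\pi)^n \det M \big)^{-1/2}
        \int_{\mathbb{R}^n} \exp\!\big( -\tfrac{1}{2}\, y^\top M^{-1} y + y^\top x \big)\, dy,

valid for any positive definite matrix M. Taking M to be a positive definite surrogate for the O(1) outlying eigendirections of J pairs each spin configuration x with an (effectively low-dimensional) Gaussian auxiliary variable y; roughly speaking, conditioned on y, what remains is an Ising model reweighted by a negative definite quadratic form, which is the subproblem the SGD-based variational sampler described above is designed for.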


