High-temperature Expansions and Message Passing Algorithms

06/20/2019
by Antoine Maillard, et al.

Improved mean-field techniques are a central theme of statistical physics methods applied to inference and learning. We revisit here some of these methods using the high-temperature expansions for disordered systems initiated by Plefka, Georges, and Yedidia. We derive the Gibbs free entropy and the subsequent self-consistent equations for a generic class of statistical models with correlated matrices, and show in particular that many classical approximation schemes, such as adaptive TAP, Expectation Consistency, or the approximations behind the Vector Approximate Message Passing algorithm, all rely on the same assumptions, which are also at the heart of high-temperature expansions. We focus on the case of rotationally invariant random coupling matrices in the 'high-dimensional' limit, in which the number of samples and the dimension are both large but have a fixed ratio. This setting encapsulates many widely studied models, such as Restricted Boltzmann Machines and Generalized Linear Models with correlated data matrices. In this general setting we show that all the approximation schemes described above are equivalent, and we conjecture that they are exact in the thermodynamic limit in the replica-symmetric phases. We reach this conclusion by resumming the infinite perturbation series, generalizing a seminal result of Parisi and Potters. A rigorous derivation of this conjecture is an interesting mathematical challenge. Along the way, we uncover several diagrammatic results connecting free probability and random matrix theory that are of independent interest.
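To make the random-matrix setting concrete, here is a minimal sketch (not taken from the paper) of how a rotationally invariant coupling matrix J = Oᵀ D O can be sampled: the spectrum D is fixed and arbitrary, while the eigenbasis O is drawn from the Haar measure on the orthogonal group. The spectral density chosen below (uniform on [-1, 1]) is purely illustrative.

```python
import numpy as np

def rotationally_invariant_coupling(n, eigenvalues, rng):
    """Sample J = O^T diag(eigenvalues) O with O Haar-distributed,
    i.e. a rotationally invariant random coupling matrix."""
    # QR decomposition of a Gaussian matrix yields a Haar orthogonal
    # matrix once the signs of R's diagonal are absorbed into Q.
    g = rng.standard_normal((n, n))
    q, r = np.linalg.qr(g)
    q *= np.sign(np.diag(r))
    return q.T @ np.diag(eigenvalues) @ q

rng = np.random.default_rng(0)
n = 500
# Illustrative spectrum: uniform on [-1, 1]; any fixed spectral
# density defines a model in the class considered here.
eigs = rng.uniform(-1.0, 1.0, size=n)
J = rotationally_invariant_coupling(n, eigs, rng)

# The spectrum of J is exactly the chosen eigenvalues, while the
# eigenbasis is uniformly random over rotations.
assert np.allclose(np.sort(np.linalg.eigvalsh(J)), np.sort(eigs))
```

A Wigner (GOE) matrix is the special case where the spectral density is a semicircle; the point of the general class is that the self-consistent equations must adapt to the full spectrum, which is where adaptive TAP, Expectation Consistency, and VAMP coincide.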

