Marvels and Pitfalls of the Langevin Algorithm in Noisy High-dimensional Inference

by   Stefano Sarao Mannelli, et al.

Gradient-descent-based algorithms and their stochastic versions have widespread applications in machine learning and statistical inference. In this work we perform an analytic study of the performance of one of them, the Langevin algorithm, in the context of noisy high-dimensional inference. We employ the Langevin algorithm to sample the posterior probability measure for the spiked matrix-tensor model. The typical behaviour of this algorithm is described by a system of integro-differential equations that we call the Langevin state evolution, whose solution is compared with that of the state evolution of approximate message passing (AMP). Our results show that, remarkably, the algorithmic threshold of the Langevin algorithm is sub-optimal with respect to the one given by AMP. We conjecture this phenomenon to be due to the residual glassiness present in that region of parameters. Finally, we show how a landscape-annealing protocol, which uses the Langevin algorithm but violates the Bayes-optimality condition, can approach the performance of AMP.
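The Langevin algorithm referred to in the abstract is the overdamped Langevin dynamics: gradient ascent on the log-posterior plus white noise. A minimal sketch of the idea, written for the simpler matrix-only (spiked Wigner) case rather than the full matrix-tensor model of the paper, with the spherical constraint |x|² = N enforced by renormalisation — all parameter values and helper names here are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def langevin_spiked_wigner(Y, delta, n_steps=1500, dt=0.01, rng=None):
    """Projected Langevin dynamics on the posterior of a spiked Wigner model
    Y = x* x*^T / sqrt(N) + sqrt(delta) * Z, with Z_ij ~ N(0, 1).
    This is an illustrative sketch, not the paper's matrix-tensor setting."""
    rng = np.random.default_rng(0) if rng is None else rng
    N = Y.shape[0]
    # random initialisation on the sphere |x|^2 = N
    x = rng.standard_normal(N)
    x *= np.sqrt(N) / np.linalg.norm(x)
    traj = np.empty((n_steps, N))
    for t in range(n_steps):
        # gradient of the log-likelihood on the sphere:
        # (Y x) / (delta sqrt(N)) - x / delta
        grad = Y @ x / (delta * np.sqrt(N)) - x / delta
        # Langevin step: drift along the gradient plus thermal noise
        x = x + dt * grad + np.sqrt(2 * dt) * rng.standard_normal(N)
        # project back onto the sphere (simple surrogate for the constraint)
        x *= np.sqrt(N) / np.linalg.norm(x)
        traj[t] = x
    return traj

# Demo: for delta < 1 the spike is recoverable and Langevin finds it.
rng = np.random.default_rng(42)
N, delta = 300, 0.3
x_star = rng.standard_normal(N)
x_star *= np.sqrt(N) / np.linalg.norm(x_star)
Z = rng.standard_normal((N, N))
Z = (Z + Z.T) / np.sqrt(2)  # symmetric noise, off-diagonal variance 1
Y = np.outer(x_star, x_star) / np.sqrt(N) + np.sqrt(delta) * Z
traj = langevin_spiked_wigner(Y, delta)
# overlap m = |x . x*| / N, averaged over the last 100 steps
overlap = np.abs(traj[-100:] @ x_star).mean() / N
```

In the matrix-only model the posterior landscape is benign and plain Langevin succeeds; the paper's point is that adding the tensor channel creates the glassy region where Langevin falls behind AMP.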




