Approximate Message Passing algorithms for rotationally invariant matrices

by Zhou Fan et al.

Approximate Message Passing (AMP) algorithms have seen widespread use across a variety of applications. However, the precise forms for their Onsager corrections and state evolutions depend on properties of the underlying random matrix ensemble, limiting the extent to which AMP algorithms derived for white noise may be applicable to data matrices that arise in practice. In this work, we study more general AMP algorithms for random matrices W that satisfy orthogonal rotational invariance in law, where W may have a spectral distribution that is different from the semicircle and Marchenko-Pastur laws characteristic of white noise. The Onsager corrections and state evolutions in these algorithms are defined by the free cumulants or rectangular free cumulants of the spectral distribution of W. Their forms were derived previously by Opper, Çakmak, and Winther using non-rigorous dynamic functional theory techniques, and we provide rigorous proofs. Our motivating application is a Bayes-AMP algorithm for Principal Components Analysis, when there is prior structure for the principal components (PCs) and possibly non-white noise. For sufficiently large signal strengths and any non-Gaussian prior distributions for the PCs, we show that this algorithm provably achieves higher estimation accuracy than the sample PCs.
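To make the role of the Onsager correction concrete, here is a minimal illustrative sketch (not the paper's algorithm) of standard AMP for a spiked Wigner model, where the noise is GOE and the single-term correction b_t = ⟨f'(z_t)⟩ is the one valid for the semicircle law. The model, signal strength, tanh denoiser, and informative initialization below are all assumptions chosen for the demo; for general rotationally invariant noise, the paper replaces this single-term correction with a combination of all past iterates weighted by free cumulants of the spectral distribution of W.

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam = 1000, 2.0                            # dimension and (assumed) signal strength

x = rng.choice([-1.0, 1.0], size=n)           # Rademacher signal
G = rng.normal(size=(n, n))
W = lam / n * np.outer(x, x) + (G + G.T) / np.sqrt(2 * n)  # spiked GOE matrix

f = np.tanh                                   # separable denoiser (illustrative choice)
df = lambda z: 1.0 - np.tanh(z) ** 2          # its derivative

z = x + rng.normal(size=n)                    # informative initialization (assumed)
u_prev = np.zeros(n)
for _ in range(10):
    u = f(z)
    b = df(z).mean()                          # Onsager coefficient for the semicircle law
    z, u_prev = W @ u - b * u_prev, u         # AMP iteration with memory-one correction

# overlap of the AMP estimate with the planted signal
overlap = abs(np.dot(f(z), x)) / (np.linalg.norm(f(z)) * np.linalg.norm(x))
print(round(overlap, 2))
```

The Onsager term `b * u_prev` is what makes the effective noise in `z` approximately Gaussian across iterations; dropping it, or using it with a non-Wigner spectral law, breaks the state evolution that predicts the overlap.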


Approximate Message Passing for orthogonally invariant ensembles: Multivariate non-linearities and spectral initialization

We study a class of Approximate Message Passing (AMP) algorithms for sym...

Estimation in Rotationally Invariant Generalized Linear Models via Approximate Message Passing

We consider the problem of signal estimation in generalized linear model...

Universality of Approximate Message Passing Algorithms

We consider a broad class of Approximate Message Passing (AMP) algorithm...

Empirical Bayes PCA in high dimensions

When the dimension of data is comparable to or larger than the number of...

Additivity of Information in Multilayer Networks via Additive Gaussian Noise Transforms

Multilayer (or deep) networks are powerful probabilistic models based on...

Expectation Consistent Plug-and-Play for MRI

For image recovery problems, plug-and-play (PnP) methods have been devel...

Analysis of Approximate Message Passing with Non-Separable Denoisers and Markov Random Field Priors

Approximate message passing (AMP) is a class of low-complexity, scalable...