
Multi-layer State Evolution Under Random Convolutional Design

by Max Daniels, et al.

Signal recovery under generative neural network priors has emerged as a promising direction in statistical inference and computational imaging. Theoretical analysis of reconstruction algorithms under generative priors is, however, challenging. For generative priors with fully connected layers and Gaussian i.i.d. weights, this was achieved by the multi-layer approximate message passing (ML-AMP) algorithm via a rigorous state evolution. However, practical generative priors are typically convolutional, allowing for computational benefits and inductive biases, and so the Gaussian i.i.d. weight assumption is very limiting. In this paper, we overcome this limitation and establish the state evolution of ML-AMP for random convolutional layers. We prove in particular that random convolutional layers belong to the same universality class as Gaussian matrices. Our proof technique is of independent interest, as it establishes a mapping between convolutional matrices and spatially coupled sensing matrices used in coding theory.
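To make the universality claim concrete at a toy level, the sketch below builds a circulant matrix from a short random i.i.d. Gaussian filter (a minimal one-channel stand-in for the random convolutional layers above, not the paper's exact multi-channel construction) and checks that, on average over draws, its output second moment matches that of a dense Gaussian i.i.d. matrix. The function name `random_conv_matrix` and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_conv_matrix(n, k):
    """n x n circulant matrix implementing circular convolution with a
    random i.i.d. Gaussian filter of length k (entry variance 1/k),
    zero-padded to length n. A toy stand-in for a random convolutional
    layer; not the authors' exact construction."""
    h = np.zeros(n)
    h[:k] = rng.normal(0.0, 1.0 / np.sqrt(k), size=k)
    # Row i applies the filter circularly shifted by i positions.
    return np.stack([np.roll(h, i) for i in range(n)])

n, k, trials = 512, 9, 200
x = rng.normal(size=n)
target = np.sum(x ** 2) / n  # expected per-entry second moment of Ax

# Second-moment comparison, averaged over matrix draws: both designs
# concentrate around ||x||^2 / n. This is only a (very weak) numerical
# reflection of the universality statement proved in the paper.
conv_var = np.mean([np.mean((random_conv_matrix(n, k) @ x) ** 2)
                    for _ in range(trials)])
gauss_var = np.mean([np.mean(
    (rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, n)) @ x) ** 2)
    for _ in range(trials)])

print(conv_var, gauss_var, target)
```

Matching second moments is of course far short of the state-evolution result, which controls the full asymptotic behavior of ML-AMP iterates; the snippet only illustrates why the two random designs are plausibly in the same class.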



