Constant-Expansion Suffices for Compressed Sensing with Generative Priors

06/07/2020
by   Constantinos Daskalakis, et al.

Generative neural networks have empirically proven very promising as structural priors for compressed sensing, since they can be trained to span low-dimensional data manifolds in high-dimensional signal spaces. Despite the non-convexity of the resulting optimization problem, it has also been shown theoretically that, for neural networks with random Gaussian weights, a signal in the range of the network can be efficiently and approximately recovered from a few noisy measurements. A major bottleneck of these theoretical guarantees, however, is a network expansivity condition: each layer of the neural network must be larger than the previous one by a logarithmic factor. Our main contribution is to break this strong expansivity assumption, showing that constant expansivity suffices for efficient recovery algorithms, and that it is also information-theoretically necessary. To overcome the theoretical bottleneck in existing approaches, we prove a novel uniform concentration theorem for random functions that might not be Lipschitz but satisfy a relaxed notion we call "pseudo-Lipschitzness." Using this theorem, we show that a matrix concentration inequality known as the Weight Distribution Condition (WDC), previously known to hold only for Gaussian matrices with logarithmic aspect ratio, in fact holds for constant aspect ratios as well. Since the WDC is the fundamental matrix concentration inequality at the heart of all existing theoretical guarantees for this problem, our tighter bound immediately yields improvements in all known results in the literature on compressed sensing with deep generative priors, including one-bit recovery, phase retrieval, and more.
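The recovery problem the abstract refers to can be sketched concretely: generate a signal with a random ReLU network whose layers expand by only a constant factor, observe it through a few noisy Gaussian measurements, and recover the latent code by gradient descent. This is a minimal illustrative sketch, not the paper's algorithm; all dimensions, the step size, and the near-truth initialization are assumptions chosen so the toy example converges.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (assumed, not from the paper): constant factor-2 expansion
# per layer, which is the regime the paper shows is sufficient.
k, n1, n2 = 10, 20, 40   # latent dimension and layer widths
m = 60                   # number of linear measurements

# Random Gaussian generator weights and measurement matrix.
W1 = rng.normal(size=(n1, k)) / np.sqrt(n1)
W2 = rng.normal(size=(n2, n1)) / np.sqrt(n2)
A = rng.normal(size=(m, n2)) / np.sqrt(m)

relu = lambda x: np.maximum(x, 0.0)

def G(z):
    """Two-layer random ReLU generator G(z) = relu(W2 relu(W1 z))."""
    return relu(W2 @ relu(W1 @ z))

# Ground-truth signal in the range of G, observed through noisy measurements.
z_star = rng.normal(size=k)
x_star = G(z_star)
y = A @ x_star + 0.001 * rng.normal(size=m)

def grad(z):
    """Subgradient of f(z) = 0.5 * ||A G(z) - y||^2, using the current ReLU masks."""
    h1 = W1 @ z
    a1 = relu(h1)
    h2 = W2 @ a1
    a2 = relu(h2)
    r = A @ a2 - y
    g2 = (A.T @ r) * (h2 > 0)
    g1 = (W2.T @ g2) * (h1 > 0)
    return W1.T @ g1

# Gradient descent from a point near the truth (a simplifying assumption for
# this sketch; the theory analyzes the full non-convex landscape).
z = z_star + 0.05 * rng.normal(size=k)
for _ in range(5000):
    z -= 0.02 * grad(z)

rel_err = np.linalg.norm(G(z) - x_star) / np.linalg.norm(x_star)
print(f"relative recovery error: {rel_err:.4f}")
```

Note that recovery here uses m = 60 measurements of a signal living in a 40-dimensional ambient space but parameterized by only 10 latent coordinates; the point of the compressed-sensing setting is that m can be far smaller than the ambient dimension when the signal lies in the range of such a generator.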
