Instance-Optimal Compressed Sensing via Posterior Sampling

06/21/2021
by   Ajil Jalal, et al.

We characterize the measurement complexity of compressed sensing of signals drawn from a known prior distribution, even when the support of the prior is the entire space (rather than, say, sparse vectors). We show that, for Gaussian measurements and any prior distribution on the signal, the posterior sampling estimator achieves near-optimal recovery guarantees. Moreover, this result is robust to model mismatch, as long as the distribution estimate (e.g., from an invertible generative model) is close to the true distribution in Wasserstein distance. We implement the posterior sampling estimator for deep generative priors using Langevin dynamics, and empirically find that it produces accurate estimates with more diversity than MAP estimation.
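As a rough illustration of the estimator described in the abstract, the sketch below runs unadjusted Langevin dynamics on the posterior for the linear model y = Ax + noise, assuming access to the gradient of the prior's log-density (e.g., obtained from an invertible generative model). The function names, step sizes, and the toy standard-Gaussian prior in the usage example are illustrative assumptions, not the paper's actual code or settings.

```python
import numpy as np

def langevin_posterior_sample(y, A, grad_log_prior, noise_std=0.1,
                              step_size=5e-4, n_steps=4000, rng=None):
    """Unadjusted Langevin dynamics targeting p(x | y) for y = Ax + noise.

    grad_log_prior(x) should return the gradient of log p(x), e.g. computed
    from an invertible generative model via the change-of-variables formula.
    (Names and defaults here are placeholders, not the paper's configuration.)
    """
    rng = np.random.default_rng() if rng is None else rng
    m, n = A.shape
    x = rng.standard_normal(n)  # arbitrary initialization
    for _ in range(n_steps):
        # Gradient of the log-posterior: grad log p(x) + grad log p(y | x).
        grad_likelihood = A.T @ (y - A @ x) / noise_std**2
        grad = grad_log_prior(x) + grad_likelihood
        # Langevin step: gradient ascent on the log-posterior plus Gaussian noise.
        x = x + step_size * grad + np.sqrt(2 * step_size) * rng.standard_normal(n)
    return x

# Toy usage with a standard Gaussian prior (purely illustrative):
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, m = 100, 40
    A = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian measurement matrix
    x_true = rng.standard_normal(n)
    y = A @ x_true + 0.1 * rng.standard_normal(m)
    x_hat = langevin_posterior_sample(y, A, grad_log_prior=lambda x: -x,
                                      noise_std=0.1)
    print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

Because the output is a draw from the (approximate) posterior rather than its mode, repeated calls with different random seeds yield diverse reconstructions consistent with the measurements, in line with the diversity-over-MAP observation in the abstract.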


