Instance-Optimal Compressed Sensing via Posterior Sampling

by Ajil Jalal et al.

We characterize the measurement complexity of compressed sensing of signals drawn from a known prior distribution, even when the support of the prior is the entire space (rather than, say, sparse vectors). We show that, for Gaussian measurements and any prior distribution on the signal, the posterior sampling estimator achieves near-optimal recovery guarantees. Moreover, this result is robust to model mismatch, as long as the distribution estimate (e.g., from an invertible generative model) is close to the true distribution in Wasserstein distance. We implement the posterior sampling estimator for deep generative priors using Langevin dynamics, and empirically find that it produces accurate estimates with more diversity than MAP.
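To illustrate the estimator described above, here is a minimal sketch of posterior sampling via Langevin dynamics for a compressed sensing problem. This is a toy setup, not the paper's implementation: instead of a deep generative prior it assumes a standard Gaussian prior, so that the prior score is simply ∇ log p(x) = −x; all dimensions and parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m = 100, 80            # signal dimension, number of measurements (toy values)
sigma = 0.1               # measurement noise level
A = rng.normal(size=(m, n)) / np.sqrt(m)   # Gaussian measurement matrix
x_true = rng.normal(size=n)                # signal drawn from the (Gaussian) prior
y = A @ x_true + sigma * rng.normal(size=m)

def grad_log_posterior(x):
    # score of the posterior: grad log p(x | y)
    #   = grad log p(x) - (1/sigma^2) A^T (A x - y)
    # with a standard Gaussian prior, grad log p(x) = -x
    return -x - (A.T @ (A @ x - y)) / sigma**2

# unadjusted Langevin dynamics:
#   x_{t+1} = x_t + eta * grad log p(x_t | y) + sqrt(2 * eta) * z_t,  z_t ~ N(0, I)
eta = 1e-4
x = np.zeros(n)
for _ in range(5000):
    x = x + eta * grad_log_posterior(x) + np.sqrt(2 * eta) * rng.normal(size=n)

# x is now (approximately) a sample from the posterior p(x | y);
# running the chain several times yields diverse reconstructions,
# unlike a single MAP point estimate
rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
residual = np.linalg.norm(A @ x - y)
```

Because the output is a sample rather than a mode, repeated runs give different reconstructions that all remain consistent with the measurements, which is the source of the diversity mentioned above.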


Robust Compressed Sensing MRI with Deep Generative Priors

The CSGM framework (Bora-Jalal-Price-Dimakis'17) has shown that deep gen...

Provable Compressed Sensing with Generative Priors via Langevin Dynamics

Deep generative models have emerged as a powerful class of priors for si...

Modeling Sparse Deviations for Compressed Sensing using Generative Models

In compressed sensing, a small number of linear measurements can be used...

Sequential Information Guided Sensing

We study the value of information in sequential compressed sensing by ch...

Theoretical links between universal and Bayesian compressed sensing algorithms

Quantized maximum a posteriori (Q-MAP) is a recently-proposed Bayesian c...

Constant-Expansion Suffices for Compressed Sensing with Generative Priors

Generative neural networks have been empirically found very promising in...

Derandomizing compressed sensing with combinatorial design

Compressed sensing is the art of reconstructing structured n-dimensional...