Learning Generative Prior with Latent Space Sparsity Constraints

05/25/2021
by   Vinayak Killedar, et al.

We address the problem of compressed sensing using a deep generative prior and consider both linear and learned nonlinear sensing mechanisms, where the nonlinear one is realized by either a fully connected or a convolutional neural network. Recently, it has been argued that the distribution of natural images does not lie on a single manifold, but rather on a union of several submanifolds. We propose a sparsity-driven latent space sampling (SDLSS) framework and develop a proximal meta-learning (PML) algorithm to enforce sparsity in the latent space. SDLSS allows the range space of the generator to be treated as a union of submanifolds. We also derive sample-complexity bounds within the SDLSS framework for the linear measurement model. The results demonstrate that, at higher degrees of compression, SDLSS is more efficient than the state-of-the-art method. We first compare linear and nonlinear sensing mechanisms on the Fashion-MNIST dataset and show that the learned nonlinear version is superior to the linear one. We then report comparisons with the deep compressive sensing (DCS) framework proposed in the literature. We also study the effect of the latent-space dimension and of the sparsity factor in validating the SDLSS framework. Performance is quantified using three objective metrics: peak signal-to-noise ratio (PSNR), structural similarity index metric (SSIM), and reconstruction error (RE).

