Convolutional Sparse Representations with Gradient Penalties

05/12/2017
by Brendt Wohlberg, et al.

While convolutional sparse representations enjoy a number of useful properties, they have received limited attention for image reconstruction problems. The present paper compares the performance of block-based and convolutional sparse representations in the removal of Gaussian white noise. Although the usual formulation of the convolutional sparse coding problem performs slightly worse than the block-based representation on this problem, the convolutional form can be made to outperform the block-based form by including suitable penalties on the gradients of the coefficient maps.
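
As a rough sketch of the approach outlined above (notation chosen here for exposition, not taken verbatim from the paper), the gradient-penalized convolutional sparse coding problem for denoising a noisy image s with dictionary filters d_m and coefficient maps x_m can be written as

\min_{\{x_m\}} \; \tfrac{1}{2} \Bigl\| \textstyle\sum_m d_m * x_m - s \Bigr\|_2^2 + \lambda \sum_m \| x_m \|_1 + \tfrac{\mu}{2} \sum_m \bigl( \| G_0 x_m \|_2^2 + \| G_1 x_m \|_2^2 \bigr) ,

where * denotes convolution, G_0 and G_1 are assumed horizontal and vertical discrete gradient operators, and λ and μ weight the sparsity and gradient penalties. Setting μ = 0 recovers the usual convolutional basis pursuit denoising formulation, and the denoised image is the reconstruction \sum_m d_m * x_m.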

Related research

06/24/2017  A Variational EM Method for Pole-Zero Modeling of Speech with Mixed Block Sparse and Gaussian Excitation
The modeling of speech can be used for speech synthesis and speech recog...

07/20/2017  Convolutional Sparse Coding: Boundary Handling Revisited
Two different approaches have recently been proposed for boundary handli...

09/27/2017  Fast Convolutional Sparse Coding in the Dual Domain
Convolutional sparse coding (CSC) is an important building block of many...

12/16/2019  Penalized-likelihood PET Image Reconstruction Using 3D Structural Convolutional Sparse Coding
Positron emission tomography (PET) is widely used for clinical diagnosis...

05/29/2017  Distributed Convolutional Sparse Coding
We consider the problem of building shift-invariant representations for ...

06/10/2014  Optimization Methods for Convolutional Sparse Coding
Sparse and convolutional constraints form a natural prior for many optim...

10/03/2019  Sparse Popularity Adjusted Stochastic Block Model
The objective of the present paper is to study the Popularity Adjusted B...
