Back-Projection based Fidelity Term for Ill-Posed Linear Inverse Problems
Ill-posed linear inverse problems appear in many image processing applications, such as deblurring, super-resolution and compressed sensing. Many restoration strategies involve minimizing a cost function, which is composed of fidelity and prior terms, balanced by a regularization parameter. While a vast amount of research has been focused on different prior models, the fidelity term is almost always chosen to be the least squares (LS) objective, which encourages fitting the linearly transformed optimization variable to the observations. In this work, we examine a different fidelity term, which has been implicitly used by the recently proposed iterative denoising and backward projections (IDBP) framework. This term encourages agreement between the projection of the optimization variable onto the row space of the linear operator and the pseudo-inverse of the linear operator ("back-projection") applied on the observations. We analytically examine the difference between the two fidelity terms for Tikhonov regularization and identify cases where the new term has an advantage over the standard LS one. Moreover, we demonstrate empirically that the behavior of the two induced cost functions for sophisticated convex and non-convex priors, such as total-variation, BM3D, and deep generative models, correlates with the obtained theoretical analysis.
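To make the two fidelity terms concrete, the sketch below contrasts them for the Tikhonov case described in the abstract: the LS objective ||Ax - y||² + λ||x||² versus the back-projection objective ||A†(Ax - y)||² + λ||x||², where A† is the pseudo-inverse of A. This is an illustrative numpy sketch, not the authors' code; the problem sizes and λ are arbitrary choices for demonstration. The closed forms follow from the normal equations, using the fact that A†A is a symmetric idempotent projection onto the row space of A.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 20, 50                       # fewer observations than unknowns: ill-posed
A = rng.standard_normal((m, n))     # linear measurement operator
x_true = rng.standard_normal(n)
y = A @ x_true                      # noiseless observations for illustration

A_pinv = np.linalg.pinv(A)          # pseudo-inverse A^dagger ("back-projection")
P = A_pinv @ A                      # projection onto the row space of A

lam = 0.1                           # regularization parameter (arbitrary)

# Standard LS fidelity + Tikhonov:  min ||A x - y||^2 + lam ||x||^2
# Normal equations: (A^T A + lam I) x = A^T y
x_ls = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)

# Back-projection fidelity + Tikhonov:  min ||A^dagger (A x - y)||^2 + lam ||x||^2
# Since P is symmetric idempotent and A^dagger y lies in the row space,
# the normal equations reduce to (P + lam I) x = A^dagger y.
x_bp = np.linalg.solve(P + lam * np.eye(n), A_pinv @ y)

def ls_fidelity(x):
    return np.sum((A @ x - y) ** 2)

def bp_fidelity(x):
    return np.sum((A_pinv @ (A @ x - y)) ** 2)
```

Each minimizer is optimal for its own regularized objective, so the two solutions generally differ; the paper's analysis characterizes when the back-projection variant is advantageous.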