Learning Sparse and Low-Rank Priors for Image Recovery via Iterative Reweighted Least Squares Minimization

04/20/2023 ∙ by Stamatios Lefkimmiatis, et al.
We introduce a novel optimization algorithm for image recovery under learned sparse and low-rank constraints, which we parameterize as weighted extensions of the ℓ_p^p vector and 𝒮_p^p Schatten-matrix quasi-norms for 0<p≤1, respectively. Our proposed algorithm generalizes the Iteratively Reweighted Least Squares (IRLS) method, typically used for signal recovery under ℓ_1- and nuclear-norm-constrained minimization. Furthermore, we interpret our overall minimization approach as a recurrent network, which we then employ to deal with inverse low-level computer-vision problems. Thanks to the convergence guarantees that our IRLS strategy offers, we are able to train the derived reconstruction networks using a memory-efficient implicit back-propagation scheme, which does not pose any restrictions on their effective depth. To assess our networks' performance, we compare them against other existing reconstruction methods on several inverse problems, namely image deblurring, super-resolution, demosaicking, and sparse recovery. Our reconstruction results are very competitive and in many cases outperform those of existing unrolled networks, whose number of parameters is orders of magnitude higher than that of our learned models.
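To illustrate the classical IRLS idea that the paper generalizes, the sketch below minimizes a smoothed ℓ_p^p-regularized least-squares objective by alternating between a reweighting step and a linear solve. This is a generic textbook-style IRLS for the vector quasi-norm only, not the paper's learned weighted variant or its Schatten-matrix extension; the function name `irls_lp` and the parameter values are illustrative assumptions.

```python
import numpy as np

def irls_lp(A, b, lam=1e-3, p=0.8, eps=1e-8, iters=200):
    """IRLS sketch for: min_x ||A x - b||_2^2 + lam * sum_i (x_i^2 + eps)^(p/2).

    Each iteration majorizes the concave penalty t -> t^(p/2) at the current
    iterate and minimizes the resulting quadratic, so the objective is
    monotonically non-increasing.
    """
    # Initialize with the (minimum-norm) least-squares solution.
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    AtA, Atb = A.T @ A, A.T @ b
    for _ in range(iters):
        # Reweighting: small entries of x get large weights, pushing them to zero.
        w = p * (x ** 2 + eps) ** (p / 2.0 - 1.0)
        # Weighted least-squares step: (2 A^T A + lam * diag(w)) x = 2 A^T b.
        x = np.linalg.solve(2.0 * AtA + lam * np.diag(w), 2.0 * Atb)
    return x
```

With a small λ and p<1, the reweighting drives most coordinates toward zero, which is why IRLS is a natural fit for the sparse-recovery experiments mentioned in the abstract; the paper's contribution is to learn weighted versions of these penalties and unroll the resulting iterations into a trainable recurrent network.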

Related research

∙ 08/10/2023 ∙ Iterative Reweighted Least Squares Networks With Convergence Guarantees for Solving Inverse Imaging Problems
∙ 01/29/2014 ∙ Smoothed Low Rank and Sparse Matrix Recovery by Iteratively Reweighted Least Squares Minimization
∙ 08/11/2021 ∙ The Lawson-Hanson Algorithm with Deviation Maximization: Finite Convergence and Sparse Recovery
∙ 07/14/2014 ∙ Performance Guarantees for Schatten-p Quasi-Norm Minimization in Recovery of Low-Rank Matrices
∙ 01/13/2021 ∙ DAEs for Linear Inverse Problems: Improved Recovery with Provable Guarantees
∙ 11/15/2016 ∙ Constrained Low-Rank Learning Using Least Squares-Based Regularization
∙ 06/28/2021 ∙ Asymptotic Log-Det Rank Minimization via (Alternating) Iteratively Reweighted Least Squares
