Low Complexity Regularization of Linear Inverse Problems

07/07/2014
by Samuel Vaiter, et al.

Inverse problems and regularization theory form a central theme in contemporary signal processing, where the goal is to reconstruct an unknown signal from partial, indirect, and possibly noisy measurements of it. A now standard method for recovering the unknown signal is to solve a convex optimization problem that enforces some prior knowledge about its structure. This approach has proved efficient for many problems routinely encountered in imaging sciences, statistics, and machine learning. This chapter reviews recent advances in the field where the regularization prior promotes solutions conforming to some notion of simplicity/low complexity. These priors encompass, as popular examples, sparsity and group sparsity (to capture the compressibility of natural signals and images), total variation and analysis sparsity (to promote piecewise regularity), and low rank (as a natural extension of sparsity to matrix-valued data). Our aim is to provide a unified treatment of all these regularizations under a single umbrella, namely the theory of partial smoothness. This framework is very general and accommodates all the low-complexity regularizers just mentioned, as well as many others. Partial smoothness turns out to be the canonical way to encode low-dimensional models, which can be linear spaces or more general smooth manifolds. This review is intended to serve as a one-stop shop toward understanding the theoretical properties of the regularized solutions. It covers a broad spectrum, including: (i) recovery guarantees and stability to noise, both in terms of ℓ²-stability and model (manifold) identification; (ii) sensitivity analysis to perturbations of the parameters involved (in particular the observations), with applications to unbiased risk estimation; (iii) convergence properties of the forward-backward proximal splitting scheme, which is particularly well suited to solving the corresponding large-scale regularized optimization problem.
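As a concrete illustration of point (iii), the following is a minimal Python sketch, not taken from the chapter, of the forward-backward proximal splitting scheme applied to the ℓ¹-regularized least-squares problem min_x 0.5*||y - Ax||² + λ||x||₁ (the Lasso), whose backward step is the soft-thresholding proximal operator. The random operator A, data y, regularization weight lam, and iteration count are purely illustrative assumptions.

    import numpy as np

    def soft_threshold(x, t):
        """Proximal operator of t * ||.||_1 (entrywise soft-thresholding)."""
        return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

    def forward_backward_l1(A, y, lam, n_iter=500):
        """Forward-backward splitting (ISTA) for
        min_x 0.5 * ||y - A x||_2^2 + lam * ||x||_1.
        A fixed step size 1/L with L = ||A||_2^2 ensures convergence."""
        L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the gradient
        step = 1.0 / L
        x = np.zeros(A.shape[1])
        for _ in range(n_iter):
            grad = A.T @ (A @ x - y)       # forward (explicit gradient) step on the smooth part
            x = soft_threshold(x - step * grad, step * lam)  # backward (proximal) step
        return x

    # Illustrative usage: random sensing matrix, sparse ground truth, small noise.
    rng = np.random.default_rng(0)
    A = rng.standard_normal((64, 256))
    x_true = np.zeros(256)
    x_true[rng.choice(256, 8, replace=False)] = 1.0
    y = A @ x_true + 0.01 * rng.standard_normal(64)
    x_hat = forward_backward_l1(A, y, lam=0.1)

The same iteration applies verbatim to the other regularizers discussed in the chapter (group sparsity, analysis sparsity, nuclear norm), with soft-thresholding replaced by the corresponding proximal operator.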


