Sparse Recovery via Differential Inclusions

06/30/2014
by Stanley Osher, et al.

In this paper, we recover sparse signals from their noisy linear measurements by solving nonlinear differential inclusions based on the notion of inverse scale space (ISS) developed in applied mathematics. Our goal is to bring this idea to bear on a challenging problem in statistics: finding, by means of dynamics, the oracle estimator, which is unbiased and sign-consistent. We call our dynamics Bregman ISS and Linearized Bregman ISS. A well-known shortcoming of LASSO, and of convex regularization approaches in general, is the bias of their estimators. In contrast, we show that under proper conditions there exists a bias-free and sign-consistent point on the solution paths of these dynamics, corresponding to a signal that is an unbiased estimate of the true signal and whose entries have the same signs as those of the true signal, i.e. the oracle estimator. Their solution paths are therefore better regularization paths than the LASSO path, on which the points are biased once sign-consistency is reached. We also show how to compute these solution paths efficiently in both continuous and discretized settings: the full solution paths can be computed exactly piece by piece, and a discretization leads to the Linearized Bregman iteration, a simple iterative thresholding rule that is easy to parallelize. Theoretical guarantees, such as sign-consistency and minimax-optimal l_2-error bounds, are established in both continuous and discrete settings for specific points on the paths, and early-stopping rules for identifying these points are given. The key treatment relies on developing differential inequalities for differential inclusions and their discretizations, which extends previous results and yields exponentially fast recovery of sparse signals before wrong ones are selected.
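
To make the discretization concrete, below is a minimal NumPy sketch of a Linearized Bregman iteration of the form z_{k+1} = z_k + delta * A^T (y - A x_k), x_{k+1} = kappa * shrink(z_{k+1}, 1), where shrink is componentwise soft thresholding. The function name, default parameters, synthetic data, and step-size choice are illustrative assumptions, not the paper's reference implementation.

```python
import numpy as np

def linearized_bregman(A, y, kappa=50.0, delta=None, n_iter=2000):
    """Sketch of a Linearized Bregman iteration for sparse recovery
    from y ~ A x. Names and defaults are illustrative assumptions.

        z_{k+1} = z_k + delta * A.T @ (y - A @ x_k)   (gradient step)
        x_{k+1} = kappa * shrink(z_{k+1}, 1)          (soft thresholding)

    Returns the discrete solution path (one iterate per row); an
    early-stopping rule would select a single point on this path.
    """
    m, p = A.shape
    if delta is None:
        # Assumed conservative step size: delta * kappa * ||A||_2^2 <= 1
        # keeps the thresholded iteration stable.
        delta = 1.0 / (kappa * np.linalg.norm(A, 2) ** 2)
    z = np.zeros(p)
    x = np.zeros(p)
    path = np.zeros((n_iter, p))
    for k in range(n_iter):
        z += delta * (A.T @ (y - A @ x))                          # gradient step on z
        x = kappa * np.sign(z) * np.maximum(np.abs(z) - 1.0, 0.0)  # shrink(z, 1)
        path[k] = x
    return path

# Tiny usage example on synthetic data (all values are assumptions).
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    m, p, s = 50, 200, 5
    A = rng.standard_normal((m, p)) / np.sqrt(m)
    x_true = np.zeros(p)
    x_true[:s] = 2.0 * rng.choice([-1.0, 1.0], size=s)
    y = A @ x_true + 0.01 * rng.standard_normal(m)
    path = linearized_bregman(A, y)
    errs = np.linalg.norm(path - x_true, axis=1)
    print("best l2 error along the path:", errs.min())
```

Note that each iteration is only a matrix-vector product followed by componentwise thresholding, which is why the rule parallelizes easily; kappa controls how closely the discrete dynamics track the continuous ISS limit.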
