Near-Ideal Behavior of Compressed Sensing Algorithms

01/26/2014
by Mehmet Eren Ahsen, et al.

In a recent paper, it is shown that the LASSO algorithm exhibits "near-ideal behavior," in the following sense: Suppose y = Ax + η, where A satisfies the restricted isometry property (RIP) with a sufficiently small constant and ‖η‖_2 ≤ ϵ. Then minimizing ‖z‖_1 subject to ‖y - Az‖_2 ≤ ϵ leads to an estimate x̂ whose error ‖x̂ - x‖_2 is bounded by a universal constant times the error achieved by an "oracle" that knows the locations of the nonzero components of x. In the optimization literature, the LASSO algorithm has been generalized in several directions, such as the group LASSO, the sparse group LASSO (with or without tree-structured overlapping groups), and, most recently, the sorted LASSO. In this paper, it is shown that any algorithm exhibits near-ideal behavior in the above sense, provided only that (i) the norm used to define the sparsity index is "decomposable," (ii) the penalty norm that is minimized in an effort to enforce sparsity is "γ-decomposable," and (iii) a "compressibility condition" in terms of a group restricted isometry property is satisfied. Specifically, the group LASSO, the sparse group LASSO (with some permissible overlap in the groups), and sorted ℓ_1-norm minimization all exhibit near-ideal behavior. Explicit bounds on the residual error are derived that contain previously known results as special cases.
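As an illustration of the constrained ℓ_1 recovery problem described above (minimize ‖z‖_1 subject to ‖y - Az‖_2 ≤ ϵ), here is a minimal sketch in Python using the cvxpy package. The Gaussian matrix A, sparsity level k, and noise level are illustrative assumptions and are not taken from the paper.

```python
# Sketch: constrained l1 minimization (basis pursuit denoising), assuming cvxpy is available.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
m, n, k = 80, 200, 5                               # measurements, ambient dimension, sparsity (assumed)
A = rng.standard_normal((m, n)) / np.sqrt(m)       # random Gaussian A, which satisfies RIP with high probability
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)   # k-sparse ground truth
eta = 0.01 * rng.standard_normal(m)                # measurement noise
eps = np.linalg.norm(eta)                          # noise bound: ||eta||_2 <= eps
y = A @ x + eta                                    # measurement model y = Ax + eta

# minimize ||z||_1  subject to  ||y - A z||_2 <= eps
z = cp.Variable(n)
prob = cp.Problem(cp.Minimize(cp.norm1(z)), [cp.norm(y - A @ z, 2) <= eps])
prob.solve()

x_hat = z.value
print("recovery error ||x_hat - x||_2 =", np.linalg.norm(x_hat - x))
```

The group LASSO and sorted-ℓ_1 variants discussed in the paper would replace the ℓ_1 objective with the corresponding penalty norm; the recovery guarantee takes the same near-ideal form under the stated decomposability and compressibility conditions.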


