Sparse Recovery via Partial Regularization: Models, Theory and Algorithms

11/23/2015
by Zhaosong Lu, et al.

In the context of sparse recovery, it is known that most existing regularizers, such as ℓ_1, suffer from a bias incurred by the leading (largest-magnitude) entries of the associated vector. To neutralize this bias, we propose a class of models with partial regularizers for recovering a sparse solution of a linear system. We show that every local minimizer of these models is either sufficiently sparse or has all of its nonzero entries above a uniform constant in magnitude, where the constant depends only on the data of the linear system. Moreover, for a class of partial regularizers, any global minimizer of these models is a sparsest solution to the linear system. We also establish sufficient conditions for local or global recovery of the sparsest solution to the linear system, one of which is weaker than the best known restricted isometry property (RIP) condition for sparse recovery by ℓ_1. In addition, we propose a first-order feasible augmented Lagrangian (FAL) method for solving these models, in which each subproblem is solved by a nonmonotone proximal gradient (NPG) method. Despite the complexity of the partial regularizers, we show that each proximal subproblem in NPG can be solved by solving a certain number of one-dimensional optimization problems, which usually admit closed-form solutions. We also show that any accumulation point of the sequence generated by FAL is a first-order stationary point of the models. Numerical results on compressed sensing and sparse logistic regression demonstrate that the proposed models substantially outperform the widely used models in the literature in terms of solution quality.
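To make the decomposition mentioned above concrete, below is a minimal sketch (not the authors' code) for one representative member of the model class: a partial ℓ_1 regularizer that penalizes only the n - r smallest-magnitude entries and leaves the r leading entries unpenalized. In this special case the proximal subproblem separates into one-dimensional problems whose closed-form solution is ordinary soft-thresholding; the function name prox_partial_l1 and the particular penalty are illustrative assumptions, and the paper's FAL/NPG machinery covers a broader family of regularizers.

```python
import numpy as np

def prox_partial_l1(v, r, lam):
    """Proximal operator of lam * (sum of the n - r smallest-magnitude entries of x).

    Illustrative sketch: the r largest-magnitude entries of v are kept as-is
    (they are unpenalized), and each remaining entry reduces to the scalar
    problem  min_t 0.5*(t - v_i)**2 + lam*|t|,  solved by soft-thresholding.
    """
    v = np.asarray(v, dtype=float)
    x = v.copy()
    # Penalized coordinates: the n - r smallest entries of v in magnitude.
    penalized = np.argsort(np.abs(v))[: max(v.size - r, 0)]
    x[penalized] = np.sign(v[penalized]) * np.maximum(np.abs(v[penalized]) - lam, 0.0)
    return x

# Example: with r = 2 the two leading entries pass through unchanged,
# while the two smaller ones are soft-thresholded with threshold 0.3.
print(prox_partial_l1([3.0, -0.4, 1.5, 0.2], r=2, lam=0.3))
# expected output: [ 3.  -0.1  1.5  0. ]
```

Within the FAL scheme described above, a prox step of this kind would follow the gradient step on the augmented Lagrangian subproblem; for more general partial regularizers the one-dimensional problems need not have closed forms, which is why the abstract says they only "usually" do.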


Related research

- Sparse Approximation via Penalty Decomposition Methods (05/10/2012)
- A Unified Study on L_1 over L_2 Minimization (08/03/2021)
- Sparse estimation via ℓ_q optimization method in high-dimensional linear regression (11/12/2019)
- Sparse linear regression with compressed and low-precision data via concave quadratic programming (09/09/2019)
- Sparse Approximations with Interior Point Methods (02/26/2021)
- Stability and Risk Bounds of Iterative Hard Thresholding (03/17/2022)
- Optimization over Sparse Symmetric Sets via a Nonmonotone Projected Gradient Method (09/29/2015)
