A Primal Dual Active Set Algorithm for a Class of Nonconvex Sparsity Optimization

10/04/2013
by Yuling Jiao, et al.

In this paper, we consider the problem of recovering a sparse vector from noisy measurement data. Traditionally, this is formulated as a penalized least-squares problem with an ℓ^1 penalty. Recent studies show that nonconvex penalties, e.g., the ℓ^0 and bridge penalties, allow more effective sparse recovery. We develop an algorithm of primal dual active set type for a class of nonconvex sparsity-promoting penalties, which covers ℓ^0, bridge, smoothly clipped absolute deviation (SCAD), capped ℓ^1, and the minimax concave penalty (MCP). First we establish the existence of a global minimizer for this class of optimization problems. Then we derive a novel necessary optimality condition for the global minimizer using the associated thresholding operator. The solutions to the optimality system are coordinate-wise minimizers, and under minor conditions, they are also local minimizers. Upon introducing the dual variable, the active set can be determined from the primal and dual variables. This relation lends itself to an iterative algorithm of active set type, which at each step updates the primal variable only on the active set and then updates the dual variable explicitly. When combined with a continuation strategy on the regularization parameter, the algorithm has a global convergence property under the restricted isometry property. Extensive numerical experiments demonstrate its efficiency and accuracy.
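To make the active-set idea concrete, here is a minimal sketch of a primal-dual active set iteration for the special case of the ℓ^0 penalty, min 0.5·||Ax − b||² + λ·||x||_0. It is an illustrative reconstruction, not the paper's implementation: the hard-threshold value √(2λ) comes from the ℓ^0 thresholding operator, the active set is read off from x + d, and all variable and function names are my own.

```python
import numpy as np

def pdas_l0(A, b, lam, max_iter=50):
    """Illustrative primal-dual active set iteration for
    min 0.5*||Ax - b||^2 + lam*||x||_0 (not the authors' code).

    The l0 thresholding operator gives the threshold t = sqrt(2*lam);
    the active set is determined from the sum of the primal variable x
    and the dual variable d = A^T (b - A x)."""
    n = A.shape[1]
    x = np.zeros(n)
    d = A.T @ (b - A @ x)           # dual variable (residual correlations)
    t = np.sqrt(2.0 * lam)
    active = np.zeros(n, dtype=bool)
    for k in range(max_iter):
        new_active = np.abs(x + d) > t
        if k > 0 and np.array_equal(new_active, active):
            break                   # active set stabilized: converged
        active = new_active
        x = np.zeros(n)
        if active.any():
            # update the primal variable only on the active set:
            # least-squares fit restricted to the active columns
            x[active], *_ = np.linalg.lstsq(A[:, active], b, rcond=None)
        # explicit dual update; the dual vanishes on the active set
        d = A.T @ (b - A @ x)
        d[active] = 0.0
    return x
```

In the full algorithm this iteration would be wrapped in a continuation loop that starts from a large λ and decreases it geometrically, warm-starting each solve from the previous one.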


