The Trimmed Lasso: Sparse Recovery Guarantees and Practical Optimization by the Generalized Soft-Min Penalty

05/18/2020
by Tal Amir, et al.

We present a new approach to the sparse approximation or best subset selection problem: find a k-sparse vector x ∈ R^d that minimizes the ℓ_2 residual ‖Ax − y‖_2. We consider a regularized approach, whereby this residual is penalized by the non-convex trimmed lasso, defined as the ℓ_1-norm of x excluding its k largest-magnitude entries. We prove that the trimmed lasso has several appealing theoretical properties; in particular, we derive sparse recovery guarantees assuming successful optimization of the penalized objective. Next, we show empirically that directly optimizing this objective can be quite challenging. Instead, we propose a surrogate for the trimmed lasso, called the generalized soft-min. This penalty smoothly interpolates between the classical lasso and the trimmed lasso, while taking into account all possible k-sparse patterns. The generalized soft-min penalty involves a summation over \binom{d}{k} terms, yet we derive a polynomial-time algorithm to compute it. This, in turn, yields a practical method for the original sparse approximation problem. Via simulations, we demonstrate its competitive performance compared to the current state of the art.
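For intuition, here is a minimal numerical sketch (not the authors' code) of both penalties. The trimmed lasso follows directly from the definition above; the generalized soft-min is computed here by brute force over all \binom{d}{k} support patterns, with the normalization and temperature convention being our assumption, purely to illustrate how it approaches the trimmed lasso as the temperature μ → 0. The paper's polynomial-time algorithm for this quantity is not reproduced here.

```python
import itertools
import numpy as np

def trimmed_lasso(x, k):
    """l1-norm of x excluding its k largest-magnitude entries."""
    mags = np.sort(np.abs(x))[::-1]   # magnitudes in descending order
    return mags[k:].sum()

def generalized_soft_min_naive(x, k, mu):
    """Brute-force soft-min over all C(d,k) k-sparse support patterns.

    Illustrative only: this enumerates all C(d,k) subsets, whereas the
    paper derives a polynomial-time algorithm for the same quantity.
    The exact normalization/temperature convention is an assumption.
    """
    absx = np.abs(x)
    # tau_S(x): the l1-mass of x outside the candidate support S
    taus = np.array([absx.sum() - absx[list(S)].sum()
                     for S in itertools.combinations(range(len(x)), k)])
    # log-sum-exp shift for numerical stability; -> min(taus) as mu -> 0
    tmin = taus.min()
    return tmin - mu * np.log(np.mean(np.exp(-(taus - tmin) / mu)))

x = np.array([3.0, -0.5, 0.2, 1.5])
print(trimmed_lasso(x, k=2))                        # 0.7
print(generalized_soft_min_naive(x, k=2, mu=0.01))  # ~0.72, -> 0.7 as mu -> 0
```

As μ → 0 the soft-min concentrates on the single best support pattern and recovers the trimmed lasso value exactly, while larger μ averages over all \binom{d}{k} patterns, which is what gives the smooth interpolation toward a lasso-like penalty described above.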


