Sparse Prediction with the k-Support Norm

04/23/2012
by Andreas Argyriou, et al.

We derive a novel norm that corresponds to the tightest convex relaxation of sparsity combined with an ℓ_2 penalty. We show that this new k-support norm provides a tighter relaxation than the elastic net and is thus a good replacement for the Lasso or the elastic net in sparse prediction problems. Through the study of the k-support norm, we also bound the looseness of the elastic net, thus shedding new light on it and providing justification for its use.
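The abstract does not reproduce the norm itself, so for concreteness here is a minimal NumPy sketch of the closed-form computation derived in the paper: sort the entries by absolute value, then split them into a quadratically penalized head and an averaged tail. The function name k_support_norm and the input checks are illustrative choices, not from the paper.

```python
import numpy as np

def k_support_norm(w, k):
    """k-support norm of w, via the closed form derived in the paper.

    With z = |w| sorted in nonincreasing order (1-indexed, z_0 = +inf),
    find the unique r in {0, ..., k-1} such that
        z_{k-r-1} > (1/(r+1)) * sum_{i=k-r}^{d} z_i >= z_{k-r};
    the norm is then
        sqrt( sum_{i=1}^{k-r-1} z_i^2 + (1/(r+1)) * (sum_{i=k-r}^{d} z_i)^2 ).
    """
    z = np.sort(np.abs(np.asarray(w, dtype=float)))[::-1]
    d = z.size
    if not 1 <= k <= d:
        raise ValueError("k must satisfy 1 <= k <= len(w)")
    for r in range(k):
        # (1/(r+1)) * sum_{i=k-r}^{d} z_i
        tail_mean = z[k - r - 1:].sum() / (r + 1)
        z_upper = z[k - r - 2] if k - r - 2 >= 0 else np.inf  # z_{k-r-1}, z_0 = +inf
        z_lower = z[k - r - 1]                                # z_{k-r}
        if z_upper > tail_mean >= z_lower:
            head = z[: k - r - 1]
            # (sum of tail)^2 / (r+1) equals (r+1) * tail_mean^2
            return float(np.sqrt(head @ head + (r + 1) * tail_mean ** 2))
    # The paper guarantees a valid r exists, so this is unreachable.
    raise RuntimeError("no valid r found")
```

As sanity checks, which match special cases noted in the paper, k = 1 recovers the ℓ_1 norm and k = d recovers the ℓ_2 norm; for any k-sparse vector the norm equals its ℓ_2 norm.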


Related research

- Sparsity by Worst-Case Penalties (10/07/2012): This paper proposes a new interpretation of sparse penalties such as the...
- Lasso and elastic nets by orthants (07/19/2023): We propose a new method for computing the lasso path, using the fact tha...
- Robust Elastic Net Regression (11/15/2015): We propose a robust elastic net (REN) model for high-dimensional sparse ...
- Sparse Regularization in Marketing and Economics (09/01/2017): Sparse alpha-norm regularization has many data-rich applications in mark...
- Feature-weighted elastic net: using "features of features" for better prediction (06/02/2020): In some supervised learning settings, the practitioner might have additi...
- Trace Lasso: a trace norm regularization for correlated designs (09/09/2011): Using the ℓ_1-norm to regularize the estimation of the parameter vector ...
