The Well Tempered Lasso

06/08/2018
by Yuanzhi Li, et al.

We study the complexity of the entire regularization path for least squares regression with an ℓ1-norm penalty, known as the Lasso. Every regression parameter in the Lasso changes piecewise-linearly as a function of the regularization value, and the number of such changes (kinks in the path) is regarded as the Lasso's complexity. Experimental results using exact path following exhibit polynomial complexity of the Lasso in the problem size. Alas, the path complexity of the Lasso on artificially designed regression problems is exponential. We use smoothed analysis as a mechanism for bridging the gap between worst-case settings and the de facto low complexity. Our analysis assumes that the observed data has a tiny amount of intrinsic noise. We then prove that the Lasso's complexity is polynomial in the problem size. While building upon the seminal work of Spielman and Teng on smoothed complexity, our analysis is morally different as it is divorced from specific path-following algorithms. We verify the validity of our analysis in experiments with both worst-case settings and real datasets. The empirical results we obtain closely match our analysis.
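The path complexity discussed above can be probed numerically: as the regularization value sweeps from large to small, coefficients enter and leave the active set, and each such change marks a kink in the piecewise-linear path. A minimal sketch below uses scikit-learn's `lasso_path` on a synthetic problem with a small amount of observation noise (the problem sizes, seed, and noise level are illustrative assumptions, not taken from the paper) and counts support changes along a grid of regularization values as a grid-based proxy for the path's kink count.

```python
import numpy as np
from sklearn.linear_model import lasso_path

# Illustrative synthetic regression problem (sizes and noise level are assumptions).
rng = np.random.default_rng(0)
n, d = 50, 10
X = rng.standard_normal((n, d))
# Observations carry a tiny amount of intrinsic noise, as in the smoothed setting.
y = X @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)

# lasso_path returns coefficients for a decreasing grid of regularization values.
alphas, coefs, _ = lasso_path(X, y, n_alphas=100)  # coefs has shape (d, n_alphas)

# Count entries entering or leaving the active set between consecutive grid points:
# a coarse proxy for the number of kinks (the path's complexity).
active = coefs != 0
changes = int(np.sum(active[:, 1:] != active[:, :-1]))
print(f"support changes along the grid: {changes}")
```

On benign random data such as this, the count grows modestly with the problem size, in line with the polynomial smoothed complexity the paper proves; an exact homotopy solver (e.g. LARS) would recover the kinks precisely rather than on a grid.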


