Average case analysis of Lasso under ultra-sparse conditions

02/25/2023
by Koki Okajima, et al.

We analyze the performance of the least absolute shrinkage and selection operator (Lasso) for the linear model when the number of regressors N grows large while the true support size d remains finite, i.e., the ultra-sparse case. The result is based on a novel treatment of the non-rigorous replica method from statistical physics, which has previously been applied only to settings where N, d, and the number of observations M tend to infinity at the same rate. Our analysis makes it possible to assess the average performance of Lasso with Gaussian sensing matrices without assumptions on the scaling of N and M, the noise distribution, or the profile of the true signal. Under mild conditions on the noise distribution, the analysis also yields a lower bound on the sample complexity required for partial and perfect support recovery when M diverges as M = O(log N). The bound obtained for perfect support recovery generalizes that of previous literature, which considers only Gaussian noise and diverging d. Extensive numerical experiments strongly support our analysis.
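The setting described above can be illustrated with a small simulation: a Gaussian sensing matrix, an ultra-sparse true signal whose support size d stays fixed while N is large, and support recovery from the Lasso estimate. The sketch below is only a minimal illustration of this setup; the dimensions, noise level, regularization strength, and use of scikit-learn's Lasso are our own illustrative assumptions, not values or code from the paper.

```python
# Minimal sketch of Lasso support recovery in an ultra-sparse regime.
# All numerical choices (N, M, d, sigma, alpha) are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

N, M, d = 10_000, 200, 5      # regressors, observations, true support size
sigma = 0.1                   # noise standard deviation (assumed)

# Gaussian sensing matrix and an ultra-sparse true signal
X = rng.standard_normal((M, N)) / np.sqrt(M)
beta_true = np.zeros(N)
support_true = rng.choice(N, size=d, replace=False)
beta_true[support_true] = rng.choice([-1.0, 1.0], size=d)

# Noisy linear observations
y = X @ beta_true + sigma * rng.standard_normal(M)

# Lasso fit; alpha is the regularization strength (illustrative choice)
lasso = Lasso(alpha=0.02, fit_intercept=False, max_iter=50_000)
lasso.fit(X, y)

# Compare the estimated support with the true one
support_est = np.flatnonzero(np.abs(lasso.coef_) > 1e-8)
perfect = set(support_est) == set(support_true)
print(f"estimated support size: {support_est.size}, perfect recovery: {perfect}")
```

Repeating such a simulation over many draws of the sensing matrix and noise is the kind of average-case experiment the abstract refers to when comparing against the theoretical predictions.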


Related research

On Model Selection Consistency of Lasso for High-Dimensional Ising Models on Tree-like Graphs (10/16/2021)
We consider the problem of high-dimensional Ising model selection using ...

Model selection with lasso-zero: adding straw to the haystack to better find needles (05/14/2018)
The high-dimensional linear model y = X β^0 + ϵ is considered and the fo...

The greedy side of the LASSO: New algorithms for weighted sparse recovery via loss function-based orthogonal matching pursuit (03/01/2023)
We propose a class of greedy algorithms for weighted sparse recovery by ...

Dictionary LASSO: Guaranteed Sparse Recovery under Linear Transformation (04/30/2013)
We consider the following signal recovery problem: given a measurement m...

L1-Regularized Least Squares for Support Recovery of High Dimensional Single Index Models with Gaussian Designs (11/25/2015)
It is known that for a certain class of single index models (SIMs) Y = f...

On the sign recovery given by the thresholded LASSO and thresholded Basis Pursuit (12/13/2018)
We consider the regression model, when the number of observations is sma...

Sparse recovery of noisy data and Grothendieck inequality (04/09/2020)
We present a detailed analysis of the unconstrained ℓ_1-method LASSO for...
