On the best choice of Lasso program given data parameters

10/17/2020
by Aaron Berk, et al.

Generalized compressed sensing (GCS) is a paradigm in which a structured high-dimensional signal may be recovered from random, under-determined, and corrupted linear measurements. Generalized Lasso (GL) programs are effective for solving GCS problems due to their proven ability to leverage underlying signal structure. Three popular GL programs are equivalent in a certain sense and are sometimes used interchangeably. Each is tuned by a governing parameter and admits an optimal parameter choice. For sparse or low-rank signal structures, this choice yields minimax order-optimal error. While GCS is well-studied, existing theory for GL programs typically concerns this optimally tuned setting. However, the optimal parameter value for a GL program depends on properties of the data and is typically unknown in practical settings. Performance in empirical problems thus hinges on a program's parameter sensitivity: it is desirable that small variation about the optimal parameter choice begets small variation about the optimal risk. We examine the risk for these three programs and demonstrate that their parameter sensitivity can differ for the same data. We prove that a gauge-constrained GL program admits asymptotic cusp-like behaviour of its risk in the limiting low-noise regime. We prove that a residual-constrained Lasso program has asymptotically suboptimal risk for very sparse vectors. These results contrast with observations about an unconstrained Lasso program, which is relatively less sensitive to its parameter choice. We support the asymptotic theory with numerical simulations, demonstrating that parameter sensitivity of GL programs is readily observed even for modest dimensional parameters. Importantly, these simulations demonstrate regimes in which one GL program exhibits sensitivity to its parameter choice while the other two do not. We hope this work aids practitioners in selecting a GL program for their problem.
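For concreteness, the three program variants referred to in the abstract are commonly written in the following standard forms (a sketch using the usual sparse-recovery notation, with measurement matrix $A$, measurements $y = Ax_0 + z$, and tuning parameters $\tau$, $\sigma$, $\lambda$; the exact gauge-based formulations in the paper may differ):

```latex
% Gauge-constrained Lasso (constraint on the regularizer)
\min_{x} \ \|Ax - y\|_2 \quad \text{s.t.} \quad \|x\|_1 \le \tau

% Residual-constrained Lasso (basis pursuit denoising)
\min_{x} \ \|x\|_1 \quad \text{s.t.} \quad \|Ax - y\|_2 \le \sigma

% Unconstrained Lasso (penalized / Lagrangian form)
\min_{x} \ \tfrac{1}{2}\|Ax - y\|_2^2 + \lambda \|x\|_1
```

For appropriately matched values of $\tau$, $\sigma$, and $\lambda$, these programs share solutions, but the correspondence between the parameters depends on the data, which is why their parameter sensitivities can differ.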
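In the special case where the measurement matrix is the identity (pure denoising), the penalized form of the Lasso has a well-known closed-form solution: coordinatewise soft thresholding at level $\lambda$. A minimal sketch (the function name is illustrative, not from the paper):

```python
import math

def soft_threshold(y, lam):
    """Closed-form solution of min_x 0.5*||x - y||^2 + lam*||x||_1.

    Each coordinate is shrunk toward zero by lam; coordinates with
    magnitude below lam are set exactly to zero, which is the source
    of the Lasso's sparsity-inducing behaviour.
    """
    return [math.copysign(max(abs(v) - lam, 0.0), v) for v in y]

# Small entries are zeroed out; large entries are shrunk by lam.
print(soft_threshold([3.0, -0.4, 1.5], 1.0))  # [2.0, -0.0, 0.5]
```

Varying `lam` here directly illustrates parameter sensitivity in the simplest setting: too large a value annihilates true signal coordinates, while too small a value retains noise.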


