Risk-consistency of cross-validation with lasso-type procedures

08/04/2013
by   Darren Homrighausen, et al.

The lasso and related sparsity-inducing algorithms have been the target of substantial theoretical and applied research. Correspondingly, many results are known about their behavior for a fixed or optimally chosen tuning parameter specified up to unknown constants. In practice, however, this oracle tuning parameter is inaccessible, so it must be selected from the data. Common statistical practice is to use a variant of cross-validation for this task. However, little is known about the theoretical properties of the resulting predictions under such data-dependent methods. We consider the high-dimensional setting with random design, wherein the number of predictors p grows with the number of observations n. Under typical assumptions on the data generating process, similar to those in the literature, we recover oracle rates up to a log factor when choosing the tuning parameter with cross-validation. Under weaker conditions, when the true model is not necessarily linear, we show that the lasso remains risk consistent relative to its linear oracle. We also generalize these results to the group lasso and square-root lasso and investigate the predictive and model selection performance of cross-validation via simulation.
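The procedure the paper analyzes can be sketched as follows: fit the lasso over a grid of tuning parameters, pick the value minimizing cross-validated prediction error, and refit at that value. A minimal illustration (not the authors' code) using scikit-learn's `LassoCV` in a high-dimensional random-design setting with p > n:

```python
# Sketch: selecting the lasso tuning parameter by K-fold cross-validation
# in a high-dimensional (p > n) random-design setting with a sparse signal.
# This is illustrative only; the data-generating setup is an assumption.
import numpy as np
from sklearn.linear_model import Lasso, LassoCV

rng = np.random.default_rng(0)
n, p = 100, 200                      # p grows with n in the paper's setting
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 2.0                       # sparse true coefficient vector
y = X @ beta + rng.standard_normal(n)

# 10-fold CV over an automatically chosen grid of tuning parameters
cv_fit = LassoCV(cv=10, n_alphas=50, random_state=0).fit(X, y)
lambda_cv = cv_fit.alpha_            # the CV-selected tuning parameter

# Refit the lasso at the CV-selected value and inspect sparsity
fit = Lasso(alpha=lambda_cv).fit(X, y)
print(lambda_cv, np.count_nonzero(fit.coef_))
```

The paper's results concern the prediction risk of this CV-selected fit relative to an oracle choice of the tuning parameter, rather than any one software implementation.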


