Aggregated hold out for sparse linear regression with a robust loss function

02/26/2020
by Guillaume Maillard, et al.

Sparse linear regression methods generally have a free hyperparameter which controls the amount of sparsity and is subject to a bias-variance tradeoff. This article considers the use of Aggregated hold-out to aggregate over values of this hyperparameter, in the context of linear regression with the Huber loss function. Aggregated hold-out (Agghoo) is a procedure which averages estimators selected by hold-out (cross-validation with a single split). In the theoretical part of the article, it is proved that Agghoo satisfies a non-asymptotic oracle inequality when it is applied to sparse estimators which are parametrized by their zero-norm. In particular, this includes a variant of the Lasso introduced by Zou, Hastie and Tibshirani. Simulations are used to compare Agghoo with cross-validation (CV); they show that Agghoo performs better than CV when the intrinsic dimension is high and when there are confounders correlated with the predictive covariates.
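
To make the procedure concrete, here is a minimal illustrative sketch of Agghoo as just described, assuming NumPy and scikit-learn are available. It is not the authors' implementation: the zero-norm-parametrized sparse estimators studied in the paper are replaced, for simplicity, by a grid of Lasso penalties fitted by least squares, and the Huber loss is used only in the hold-out selection step. The names huber_loss, agghoo_predict and the grid alphas are placeholders.

```python
import numpy as np
from sklearn.linear_model import Lasso


def huber_loss(residuals, delta=1.35):
    """Mean Huber loss of a vector of residuals."""
    abs_r = np.abs(residuals)
    return np.mean(np.where(abs_r <= delta,
                            0.5 * residuals ** 2,
                            delta * (abs_r - 0.5 * delta)))


def agghoo_predict(X, y, X_new, alphas, n_splits=10, train_frac=0.8, seed=0):
    """Agghoo: average the predictors selected by hold-out over several random splits."""
    rng = np.random.default_rng(seed)
    n = len(y)
    n_train = int(train_frac * n)
    predictions = []
    for _ in range(n_splits):
        perm = rng.permutation(n)
        train, val = perm[:n_train], perm[n_train:]
        # Hold-out step: on this single split, keep the estimator whose
        # validation Huber risk is smallest over the hyperparameter grid.
        best_model, best_risk = None, np.inf
        for alpha in alphas:
            model = Lasso(alpha=alpha).fit(X[train], y[train])
            risk = huber_loss(y[val] - model.predict(X[val]))
            if risk < best_risk:
                best_model, best_risk = model, risk
        predictions.append(best_model.predict(X_new))
    # Aggregation step: average the hold-out-selected predictors.
    return np.mean(predictions, axis=0)
```

With n_splits=1 this sketch reduces to plain hold-out selection; averaging over several independent splits is the aggregation step that distinguishes Agghoo from ordinary hold-out or CV selection.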


