A new concentration inequality for the excess risk in least-squares regression with random design and heteroscedastic noise

02/16/2017
by Adrien Saumard, et al.

We prove a new concentration inequality for the excess risk of an M-estimator in least-squares regression with random design and heteroscedastic noise. Results of this kind are a central tool in modern model selection theory, as well as in recent work on the behavior of regularized estimators such as the LASSO, the group LASSO, and SLOPE.
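For context, the quantity controlled by such an inequality can be written as follows. This is a standard formulation of the random-design least-squares setting, sketched here with our own notation rather than the paper's:

```latex
% Least-squares risk of a candidate regression function f
% (random design: the covariate X is random, jointly with Y)
R(f) = \mathbb{E}\bigl[(Y - f(X))^2\bigr]

% Best element of the model \mathcal{F} (assumed to exist)
f_* \in \operatorname*{arg\,min}_{f \in \mathcal{F}} R(f)

% Excess risk of the M-estimator \hat{f}, e.g. the empirical risk minimizer
\ell(f_*, \hat{f}) = R(\hat{f}) - R(f_*) \;\ge\; 0
```

A concentration inequality for the excess risk then bounds the deviations of \(\ell(f_*, \hat{f})\) around a deterministic quantity with high probability. Heteroscedastic noise means that the conditional variance \(\operatorname{Var}(Y \mid X = x)\) may depend on \(x\), so the noise level is not assumed constant across the design.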


Related research:

- 08/12/2020, "A matrix concentration inequality for products": We present a non-asymptotic concentration inequality for the random matr...
- 06/16/2022, "Universality of regularized regression estimators in high dimensions": The Convex Gaussian Min-Max Theorem (CGMT) has emerged as a prominent th...
- 06/09/2020, "On Coresets For Regularized Regression": We study the effect of norm based regularization on the size of coresets...
- 05/16/2007, "Lasso type classifiers with a reject option": We consider the problem of binary classification where one can, for a pa...
- 09/30/2019, "Rejoinder on: Minimal penalties and the slope heuristics: a survey": This text is the rejoinder following the discussion of a survey paper ab...
- 12/08/2020, "A Concentration Inequality for the Facility Location Problem": We give a concentration inequality for a stochastic version of the facil...
- 12/03/2019, "Bayesian Model Selection for Change Point Detection and Clustering": We address the new problem of estimating a piece-wise constant signal wi...
