Data-driven calibration of linear estimators with minimal penalties

09/10/2009
by Sylvain Arlot, et al.

This paper tackles the problem of selecting among several linear estimators in non-parametric regression; this includes model selection for linear regression, the choice of a regularization parameter in kernel ridge regression, spline smoothing, or locally weighted regression, and the choice of a kernel in multiple kernel learning. We propose a new algorithm that first consistently estimates the noise variance, building on the concept of minimal penalty previously introduced in the context of model selection. Plugging this variance estimate into Mallows' C_L penalty is then proved to yield an algorithm satisfying an oracle inequality. Simulation experiments with kernel ridge regression and multiple kernel learning show that the proposed algorithm often improves significantly on existing calibration procedures such as generalized cross-validation.
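To make the two-step procedure concrete, here is a minimal sketch (not the authors' reference implementation), assuming kernel ridge regression with a Gaussian kernel on synthetic one-dimensional data. The smoother matrices A_lambda, the minimal-penalty shape 2 tr(A) - tr(A^T A), and the use of Mallows' C_L follow the setting described in the abstract; the toy data, the grid of candidate constants, and the simple jump detector used to locate the variance estimate are illustrative assumptions.

```python
# Minimal sketch of the two-step procedure, assuming kernel ridge
# regression (KRR).  Toy data, grids, and the jump detector are
# illustrative assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data: y_i = sin(2*pi*x_i) + noise.
n = 200
X = rng.uniform(0.0, 1.0, n)
sigma_true = 0.5
y = np.sin(2 * np.pi * X) + sigma_true * rng.normal(size=n)

# Family of linear estimators: KRR smoother matrices A_lambda,
# so that the fitted values are f_hat = A @ y.
K = np.exp(-((X[:, None] - X[None, :]) ** 2) / (2 * 0.1**2))  # Gaussian kernel
lambdas = np.logspace(-6, 1, 40)
smoothers = [K @ np.linalg.inv(K + n * lam * np.eye(n)) for lam in lambdas]

resid = np.array([np.sum((y - A @ y) ** 2) for A in smoothers])
df = np.array([np.trace(A) for A in smoothers])  # effective degrees of freedom
# Minimal-penalty shape for a linear estimator A: 2 tr(A) - tr(A^T A).
pen_min = np.array([2 * np.trace(A) - np.trace(A.T @ A) for A in smoothers])

# Step 1: estimate the noise variance with the minimal penalty.
# For constants C below sigma^2 the selected estimator stays very
# complex (large tr(A)); tr(A) drops sharply once C exceeds sigma^2,
# so we read off the variance at the largest jump (a simple heuristic).
C_grid = np.linspace(0.01, 2.0, 400)
df_selected = np.array([df[np.argmin(resid + C * pen_min)] for C in C_grid])
jump = np.argmax(df_selected[:-1] - df_selected[1:])
sigma2_hat = C_grid[jump + 1]

# Step 2: plug the variance estimate into Mallows' C_L penalty,
# crit(lambda) = ||y - A y||^2 + 2 * sigma2_hat * tr(A).
lam_hat = lambdas[np.argmin(resid + 2 * sigma2_hat * df)]

print(f"estimated noise variance: {sigma2_hat:.3f} (true: {sigma_true**2:.3f})")
print(f"selected regularization : {lam_hat:.2e}")
```

The jump detector above simply takes the largest drop of the selected degrees of freedom along the C grid; the paper's estimator and its guarantees are more careful, but the mechanism is the same: the selected estimator stays overly complex as long as the penalty constant is below the noise variance, which is what the minimal penalty exploits.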

Related research

Convex Techniques for Model Selection (11/27/2014)
We develop a robust convex algorithm to select the regularization parame...

Kernel Selection in Nonparametric Regression (06/13/2020)
In the regression model Y = b(X) + ε, where X has a density f, this paper...

Minimal penalties and the slope heuristics: a survey (01/22/2019)
Birgé and Massart proposed in 2001 the slope heuristics as a way to choo...

Aggregated hold out for sparse linear regression with a robust loss function (02/26/2020)
Sparse linear regression methods generally have a free hyperparameter wh...

Streaming kernel regression with provably adaptive mean, variance, and regularization (08/02/2017)
We consider the problem of streaming kernel regression, when the observa...

Adaptive debiased machine learning using data-driven model selection techniques (07/24/2023)
Debiased machine learning estimators for nonparametric inference of smoo...

Bandwidth Selection for Gaussian Kernel Ridge Regression via Jacobian Control (05/24/2022)
Most machine learning methods depend on the tuning of hyper-parameters. ...
