Early stopping and polynomial smoothing in regression with reproducing kernels

07/14/2020
by Yaroslav Averyanov, et al.

In this paper we study the problem of early stopping for iterative learning algorithms in a reproducing kernel Hilbert space (RKHS) in the nonparametric regression framework. In particular, we work with gradient descent and (iterative) kernel ridge regression. We present a data-driven rule, based on the so-called minimum discrepancy principle, for performing early stopping without a validation set. The method relies on a single assumption on the regression function: that it belongs to the RKHS. The proposed rule is proved to be minimax optimal over different types of kernel spaces, including kernels of finite rank and Sobolev smoothness classes. The proof is based on a fixed-point analysis of localized Rademacher complexities, a standard technique for obtaining optimal rates in the nonparametric regression literature. In addition, we present simulation results on artificial data showing that the designed rule performs comparably to other stopping rules, such as the one determined by V-fold cross-validation.
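The stopping idea described in the abstract can be illustrated with a short sketch. The Python code below is a minimal, hypothetical illustration of a minimum-discrepancy-style rule for kernel gradient descent, not the paper's exact procedure: the Gaussian kernel, the step size, and the plug-in noise variance sigma2 are assumptions made only for this example. Gradient descent is run until the empirical residual ||y - f_t||_n^2 first drops below the (estimated) noise level, and that iteration is returned as the stopping time.

```python
import numpy as np

def gaussian_kernel(X, bandwidth=1.0):
    # Hypothetical helper: squared-exponential kernel matrix on the design points.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def kernel_gd_minimum_discrepancy(X, y, sigma2, max_iter=1000, bandwidth=1.0):
    """Functional gradient descent on the least-squares objective in an RKHS,
    stopped at the first iteration whose empirical residual ||y - f_t||_n^2
    falls below the plug-in noise level sigma2 (a minimum-discrepancy-style rule)."""
    n = X.shape[0]
    K = gaussian_kernel(X, bandwidth)
    # Constant step size chosen so that eta * lambda_max(K / n) <= 1 (stable iteration).
    eta = 1.0 / np.linalg.eigvalsh(K / n)[-1]
    alpha = np.zeros(n)                      # dual coefficients: f_t(.) = sum_i alpha_i k(., x_i)
    for t in range(1, max_iter + 1):
        residual = y - K @ alpha             # y - f_t at the design points
        if np.mean(residual ** 2) <= sigma2: # discrepancy has reached the noise level: stop
            return alpha, t
        alpha = alpha + (eta / n) * residual # functional gradient step on (1/2n)||y - f||^2
    return alpha, max_iter

# Usage on synthetic data (all settings below are illustrative):
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
sigma = 0.3
y = np.sin(3.0 * X[:, 0]) + sigma * rng.normal(size=200)
alpha, tau = kernel_gd_minimum_discrepancy(X, y, sigma2=sigma ** 2)
print("stopped at iteration", tau)
```

Iterative kernel ridge regression would be handled analogously by monitoring the residual along its iteration (or regularization) path; the paper gives the precise thresholds, noise estimation, and optimality guarantees.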

Related research

06/15/2013  Early stopping and non-parametric regression: An optimal data-dependent stopping rule
07/05/2017  Early stopping for kernel boosting algorithms: A general analysis with localized complexities
04/17/2020  Analyzing the discrepancy principle for kernelized spectral filter learning algorithms
08/20/2020  Minimum discrepancy principle strategy for choosing k in k-NN regression
05/25/2018  Early Stopping for Nonparametric Testing
03/20/2018  V-Splines and Bayes Estimate
12/28/2021  Ensemble Recognition in Reproducing Kernel Hilbert Spaces through Aggregated Measurements
