Generalization Error Rates in Kernel Regression: The Crossover from the Noiseless to Noisy Regime

05/31/2021
by Hugo Cui, et al.

In this manuscript we consider Kernel Ridge Regression (KRR) under the Gaussian design. Exponents for the decay of the excess generalization error of KRR have been reported in various works under the assumption of power-law decay of the eigenvalues of the feature covariance. These decays were, however, provided for markedly different setups, namely the noiseless case with constant regularization and the noisy, optimally regularized case. Intermediary settings have been left substantially uncharted. In this work, we unify and extend this line of work, providing a characterization of all regimes and excess error decay rates that can be observed in terms of the interplay of noise and regularization. In particular, we show the existence of a crossover in the noisy setting, where the decay rate transitions from its noiseless exponent to its noisy value as the sample complexity is increased. Finally, we illustrate how this crossover can also be observed on real data sets.
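To make the setup concrete, here is a minimal numerical sketch (not the authors' code): it draws Gaussian-design features whose covariance eigenvalues decay as a power law, fits ridge regression in feature space (equivalent to KRR with the corresponding kernel) with constant regularization, and prints the excess test error as the sample size grows. The exponents alpha and beta, the noise level noise_std, and the regularization lam are illustrative assumptions; comparing noise_std = 0.0 with a positive value lets one inspect the apparent decay slopes in the noiseless and noisy regimes.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Illustrative setup (parameters are assumptions, not the paper's) ---
d = 2000          # truncated number of features
alpha = 1.5       # covariance eigenvalues: lambda_k ~ k^(-alpha)
beta = 2.0        # teacher coefficients: theta*_k ~ k^(-beta/2)
noise_std = 0.5   # label noise; set to 0.0 for the noiseless case
lam = 1e-3        # constant ridge regularization

k = np.arange(1, d + 1)
eigvals = k ** (-alpha)
theta_star = k ** (-beta / 2)

def excess_error(n):
    """Fit ridge regression on n Gaussian-design samples; return the excess test risk."""
    # Gaussian features with covariance diag(eigvals)
    X = rng.standard_normal((n, d)) * np.sqrt(eigvals)
    y = X @ theta_star + noise_std * rng.standard_normal(n)
    # Dual (kernel) form of the ridge estimator: theta_hat = X^T (X X^T + n*lam*I)^{-1} y
    K = X @ X.T + n * lam * np.eye(n)
    theta_hat = X.T @ np.linalg.solve(K, y)
    # Excess risk under this design: (theta_hat - theta*)^T Sigma (theta_hat - theta*),
    # i.e. the population test error above the noise floor.
    diff = theta_hat - theta_star
    return float(np.sum(eigvals * diff ** 2))

for n in [50, 100, 200, 400, 800, 1600]:
    errs = [excess_error(n) for _ in range(5)]
    print(f"n={n:5d}  excess error ~ {np.mean(errs):.4e}")
```

Plotting the printed values on a log-log scale gives the empirical decay exponent; in this sketch the slope observed at small n need not match the one at large n once noise_std > 0, which is the kind of crossover the abstract describes.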

