Non-asymptotic Analysis in Kernel Ridge Regression

06/02/2020
by Zejian Liu, et al.

We develop a general non-asymptotic analysis of learning rates in kernel ridge regression (KRR), applicable to arbitrary Mercer kernels with multi-dimensional support. Our analysis rests on an operator-theoretic framework, at whose core lie two error bounds under reproducing kernel Hilbert space norms; these bounds cover a general class of kernels and regression functions and extend readily to various inferential goals through augmenting results. Applied to KRR estimators, the analysis yields error bounds under the stronger supremum norm, in addition to the commonly studied weighted L_2 norm; in a concrete example specialized to the Matérn kernel, the established bounds recover nearly minimax-optimal rates. The wide applicability of the analysis is further demonstrated through two new theoretical results: (1) non-asymptotic learning rates for mixed partial derivatives of KRR estimators, and (2) a non-asymptotic characterization of the posterior variance of Gaussian processes, which corresponds to uncertainty quantification in kernel methods and nonparametric Bayes.
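To make the object of study concrete, here is a minimal numpy sketch of a KRR estimator with a Matérn-3/2 kernel. This is illustrative only: the kernel form is standard, but the length-scale, regularization level, and test function below are our own choices, not values from the paper.

```python
import numpy as np

def matern32(x, y, ell=0.5):
    # Matérn kernel with smoothness nu = 3/2 and length-scale ell
    r = np.abs(x[:, None] - y[None, :])
    s = np.sqrt(3.0) * r / ell
    return (1.0 + s) * np.exp(-s)

def krr_fit(x_train, y_train, lam=1e-2, ell=0.5):
    # Solve (K + n*lam*I) alpha = y; the regularization lam governs
    # the bias-variance trade-off that learning-rate analyses quantify.
    n = len(x_train)
    K = matern32(x_train, x_train, ell)
    return np.linalg.solve(K + n * lam * np.eye(n), y_train)

def krr_predict(x_new, x_train, alpha, ell=0.5):
    # KRR estimate f_hat(x) = sum_i alpha_i k(x, x_i)
    return matern32(x_new, x_train, ell) @ alpha

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 200))
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(200)
alpha = krr_fit(x, y)
x_grid = np.linspace(0.0, 1.0, 50)
f_hat = krr_predict(x_grid, x, alpha)
# Supremum-norm error against the noiseless truth, the norm
# in which the paper's stronger bounds are stated.
sup_err = np.max(np.abs(f_hat - np.sin(2 * np.pi * x_grid)))
```

The supremum-norm error computed at the end is exactly the quantity the abstract's stronger bounds control, as opposed to the more common weighted L_2 error.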

11/27/2020

Equivalence of Convergence Rates of Posterior Distributions and Bayes Estimators for Functions and Nonparametric Functionals

We study the posterior contraction rates of a Bayesian method with Gauss...
05/25/2019

Kernel Truncated Randomized Ridge Regression: Optimal Rates and Low Noise Acceleration

In this paper, we consider the nonparametric least square regression in ...
01/01/2020

On the Improved Rates of Convergence for Matérn-type Kernel Ridge Regression, with Application to Calibration of Computer Models

Kernel ridge regression is an important nonparametric method for estimat...
04/09/2021

How rotational invariance of common kernels prevents generalization in high dimensions

Kernel ridge regression is well-known to achieve minimax optimal rates i...
03/30/2021

Minimum complexity interpolation in random features models

Despite their many appealing properties, kernel methods are heavily affe...
10/13/2021

Spectral-norm risk rates for multi-taper estimation of Gaussian processes

We consider the estimation of the covariance of a stationary Gaussian pr...
08/10/2020

Deterministic error bounds for kernel-based learning techniques under bounded noise

We consider the problem of reconstructing a function from a finite set o...
