On the Improved Rates of Convergence for Matérn-type Kernel Ridge Regression, with Application to Calibration of Computer Models

01/01/2020
by Rui Tuo, et al.

Kernel ridge regression is an important nonparametric method for estimating smooth functions. We introduce a new set of conditions under which the actual rates of convergence of the kernel ridge regression estimator, under both the L_2 norm and the norm of the reproducing kernel Hilbert space, exceed the standard minimax rates. An application of this theory leads to a new understanding of the Kennedy-O'Hagan approach for calibrating model parameters of computer simulations. We prove that, under certain conditions, the Kennedy-O'Hagan calibration estimator with a known covariance function converges to the minimizer of the norm of the residual function in the reproducing kernel Hilbert space.
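
For readers who want the objects in the abstract written out, a minimal sketch in standard notation follows; the symbols (kernel \Phi, regularization parameter \lambda, true response \zeta, computer model output y^s(\cdot, \theta)) are illustrative choices and are not taken from this page. Given data (x_1, y_1), \ldots, (x_n, y_n), the kernel ridge regression estimator over the reproducing kernel Hilbert space \mathcal{H} generated by \Phi is

\hat{f}_n \;=\; \operatorname*{argmin}_{f \in \mathcal{H}} \; \frac{1}{n} \sum_{i=1}^{n} \bigl( y_i - f(x_i) \bigr)^2 \;+\; \lambda \, \| f \|_{\mathcal{H}}^2,

which, by the representer theorem, has the closed form \hat{f}_n(x) = k(x)^\top (K + n\lambda I_n)^{-1} y, with K_{ij} = \Phi(x_i, x_j) and k(x) = (\Phi(x, x_1), \ldots, \Phi(x, x_n))^\top. In this notation, the limit of the Kennedy-O'Hagan calibration estimator described above can be written as

\theta^* \;=\; \operatorname*{argmin}_{\theta \in \Theta} \; \bigl\| \zeta(\cdot) - y^s(\cdot, \theta) \bigr\|_{\mathcal{H}},

the minimizer of the RKHS norm of the residual function \zeta - y^s(\cdot, \theta).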


Related research

05/12/2023 · On the Optimality of Misspecified Kernel Ridge Regression
In the misspecified kernel ridge regression problem, researchers usually...

08/16/2021 · Uniform Function Estimators in Reproducing Kernel Hilbert Spaces
This paper addresses the problem of regression to reconstruct functions,...

09/22/2020 · Risk upper bounds for RKHS ridge group sparse estimator in the regression model with non-Gaussian and non-bounded error
We consider the problem of estimating a meta-model of an unknown regress...

02/19/2019 · Optimal Function-on-Scalar Regression over Complex Domains
In this work we consider the problem of estimating function-on-scalar re...

09/15/2011 · Sampled forms of functional PCA in reproducing kernel Hilbert spaces
We consider the sampling problem for functional PCA (fPCA), where the si...

06/02/2020 · Non-asymptotic Analysis in Kernel Ridge Regression
We develop a general non-asymptotic analysis of learning rates in kernel...
