Convergence guarantees for kernel-based quadrature rules in misspecified settings

05/24/2016
by Motonobu Kanagawa et al.

Kernel-based quadrature rules are becoming important in machine learning and statistics, as they achieve super-√(n) convergence rates in numerical integration (i.e., faster than the n^{-1/2} rate of Monte Carlo), and thus provide alternatives to Monte Carlo integration in challenging settings where integrands are expensive to evaluate or high dimensional. These rules are based on the assumption that the integrand has a certain degree of smoothness, expressed by requiring that the integrand belongs to a certain reproducing kernel Hilbert space (RKHS). However, this assumption can be violated in practice (e.g., when the integrand is a black-box function), and no general theory has been established for the convergence of kernel quadratures in such misspecified settings. Our contribution is in proving that kernel quadratures can be consistent even when the integrand does not belong to the assumed RKHS, i.e., when the integrand is less smooth than assumed. Specifically, we derive convergence rates that depend on the (unknown) lesser smoothness of the integrand, where the degree of smoothness is expressed via powers of RKHSs or via Sobolev spaces.
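
To make the setting concrete, the sketch below implements a generic kernel quadrature rule in Python: the weights are chosen so that the rule integrates the kernel mean embedding exactly, which is what gives the estimate its worst-case-error interpretation in the assumed RKHS. This is a minimal illustration, not the construction analyzed in the paper; the Gaussian kernel, the uniform measure on [0, 1], the lengthscale of 0.2, the equispaced nodes, and the test integrand |x - 0.5| are all illustrative assumptions.

```python
# Minimal kernel quadrature sketch (illustrative assumptions, not the paper's method):
# weights solve K w = z, where K_ij = k(x_i, x_j) and z_i = E_{X~P}[k(X, x_i)],
# so that the rule integrates the kernel mean embedding exactly.
import numpy as np
from scipy.special import erf


def kernel_quadrature_weights(x, lengthscale=0.2, jitter=1e-10):
    """Weights for a Gaussian kernel against P = Uniform[0, 1] (assumed setup)."""
    K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / lengthscale ** 2)
    # Closed-form kernel mean z_i = \int_0^1 exp(-(t - x_i)^2 / (2 l^2)) dt.
    c = lengthscale * np.sqrt(np.pi / 2.0)
    z = c * (erf((1.0 - x) / (np.sqrt(2.0) * lengthscale))
             + erf(x / (np.sqrt(2.0) * lengthscale)))
    # Small jitter keeps the kernel matrix numerically invertible.
    return np.linalg.solve(K + jitter * np.eye(len(x)), z)


n = 50
x = np.linspace(0.0, 1.0, n)                 # equispaced quadrature nodes
w = kernel_quadrature_weights(x)

f = lambda t: np.abs(t - 0.5)                # integrand less smooth than the Gaussian RKHS
kq_estimate = w @ f(x)                       # kernel quadrature estimate of the integral
mc_estimate = np.mean(f(np.random.default_rng(0).uniform(size=n)))  # plain Monte Carlo
print(kq_estimate, mc_estimate, 0.25)        # true value of \int_0^1 |t - 0.5| dt is 0.25
```

Because |x - 0.5| is not contained in the Gaussian RKHS, this toy setup is already misspecified in the sense studied above; results of the kind proved in the paper are what justify expecting such a weighted estimate to remain consistent as n grows.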


Related research

09/01/2017: Convergence Analysis of Deterministic Kernel-Based Quadrature Rules in Misspecified Settings
This paper presents convergence analysis of kernel-based quadrature rule...

08/02/2023: Improved convergence rates of nonparametric penalized regression under misspecified total variation
Penalties that induce smoothness are common in nonparametric regression....

03/23/2022: Stability of convergence rates: Kernel interpolation on non-Lipschitz domains
Error estimates for kernel interpolation in Reproducing Kernel Hilbert S...

06/03/2018: Analysis of regularized Nyström subsampling for regression functions of low smoothness
This paper studies a Nyström type subsampling approach to large kernel l...

05/16/2021: Analysis of target data-dependent greedy kernel algorithms: Convergence rates for f-, f · P- and f/P-greedy
Data-dependent greedy algorithms in kernel spaces are known to provide f...

03/04/2022: Convergence Rates for Oversmoothing Banach Space Regularization
This paper studies Tikhonov regularization for finitely smoothing operat...

06/15/2023: Convergence of one-level and multilevel unsymmetric collocation for second order elliptic boundary value problems
The paper proves convergence of one-level and multilevel unsymmetric collocation f...
