Sobolev Norm Learning Rates for Conditional Mean Embeddings

05/16/2021
by Prem Talwai, et al.

We develop novel learning rates for conditional mean embeddings by applying the theory of interpolation for reproducing kernel Hilbert spaces (RKHS). Our learning rates demonstrate consistency of the sample estimator under drastically weaker assumptions than the state of the art, allowing much broader application of conditional mean embeddings to more complex ML/RL settings involving infinite-dimensional RKHS and continuous state spaces.


Related research

12/02/2019  A Rigorous Theory of Conditional Mean Embeddings
Conditional mean embeddings (CME) have proven themselves to be a powerfu...

08/02/2022  Optimal Rates for Regularized Conditional Mean Embedding Learning
We address the consistency of a kernel ridge regression estimate of the ...

05/21/2012  Conditional mean embeddings as regressors - supplementary
We demonstrate an equivalence between reproducing kernel Hilbert space (...

02/10/2020  A Measure-Theoretic Approach to Kernel Conditional Mean Embeddings
We present a new operator-free, measure-theoretic definition of the cond...

09/01/2018  Hyperparameter Learning for Conditional Mean Embeddings with Rademacher Complexity Bounds
Conditional mean embeddings are nonparametric models that encode conditi...

06/01/2019  Bayesian Deconditional Kernel Mean Embeddings
Conditional kernel mean embeddings form an attractive nonparametric fram...

10/23/2012  Further properties of Gaussian Reproducing Kernel Hilbert Spaces
We generalize the orthonormal basis for the Gaussian RKHS described in M...
