Variance-Aware Estimation of Kernel Mean Embedding

10/13/2022
by Geoffrey Wolfer, et al.

An important feature of kernel mean embeddings (KME) is that the rate of convergence of the empirical KME to the true distribution KME can be bounded independently of the dimension of the space, the properties of the distribution, and the smoothness features of the kernel. We show how to speed up convergence by leveraging variance information in the RKHS. Furthermore, we show that even when such information is a priori unknown, we can efficiently estimate it from the data, recovering the desiderata of a distribution-agnostic bound that enjoys acceleration in fortuitous settings. We illustrate our methods in the context of hypothesis testing and robust parametric estimation.
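To make the objects in the abstract concrete, the following is a minimal sketch (not the authors' method) of the empirical KME and the two quantities it relates to: the RKHS distance between two empirical embeddings (the MMD), and a plug-in estimate of the trace of the covariance operator in the RKHS, which is the kind of variance information the paper leverages. The Gaussian kernel, the bandwidth `gamma`, and all function names are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian RBF kernel matrix: k(x, y) = exp(-gamma * ||x - y||^2).
    # Kernel choice and bandwidth are assumptions for illustration.
    sq = (X**2).sum(1)[:, None] + (Y**2).sum(1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-gamma * sq)

def mmd_sq(X, Y, gamma=1.0):
    # Squared RKHS distance between the empirical KMEs of samples X and Y:
    # ||mu_hat_X - mu_hat_Y||_H^2 expanded via kernel evaluations.
    Kxx = rbf_kernel(X, X, gamma)
    Kyy = rbf_kernel(Y, Y, gamma)
    Kxy = rbf_kernel(X, Y, gamma)
    return Kxx.mean() + Kyy.mean() - 2.0 * Kxy.mean()

def rkhs_variance(X, gamma=1.0):
    # Plug-in estimate of E[k(X, X)] - E[k(X, X')], the trace of the
    # covariance operator in the RKHS; small values indicate a tightly
    # concentrated embedding, which is the favorable regime for
    # variance-aware bounds.
    K = rbf_kernel(X, X, gamma)
    n = len(X)
    diag = np.trace(K) / n
    off_diag = (K.sum() - np.trace(K)) / (n * (n - 1))
    return diag - off_diag
```

For instance, `mmd_sq` between two samples from the same distribution shrinks as the sample size grows, and `rkhs_variance` can be computed from the same kernel matrix at no extra cost, which is what makes estimating the variance term from data practical.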

