Minimum Kernel Discrepancy Estimators

10/28/2022
by Chris J. Oates, et al.

For two decades, reproducing kernels and their associated discrepancies have facilitated elegant theoretical analyses in the setting of quasi-Monte Carlo. These same tools are now receiving interest in statistics and related fields as criteria for selecting an appropriate statistical model for a given dataset. This article focuses on minimum kernel discrepancy estimators: their use in statistical applications is reviewed, and a general theoretical framework for establishing their asymptotic properties is presented.
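To make the idea concrete, here is a minimal sketch of a minimum kernel discrepancy estimator using the (biased) maximum mean discrepancy with a Gaussian kernel: the location parameter of a one-dimensional Gaussian model is estimated by grid search, choosing the value whose simulated sample is closest to the data in MMD. The kernel choice, bandwidth, model, and function names (`mmd_squared`, `gaussian_kernel`) are illustrative assumptions, not from the article.

```python
import numpy as np

def gaussian_kernel(x, y, bandwidth=1.0):
    # k(x, y) = exp(-(x - y)^2 / (2 * bandwidth^2)), evaluated pairwise
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2 * bandwidth ** 2))

def mmd_squared(x, y, bandwidth=1.0):
    # Biased (V-statistic) estimate of the squared maximum mean discrepancy
    kxx = gaussian_kernel(x, x, bandwidth).mean()
    kyy = gaussian_kernel(y, y, bandwidth).mean()
    kxy = gaussian_kernel(x, y, bandwidth).mean()
    return kxx + kyy - 2 * kxy

rng = np.random.default_rng(0)
data = rng.normal(loc=2.0, scale=1.0, size=500)  # observed sample

# Minimum MMD estimate of the location parameter via grid search:
# reuse one batch of standard-normal draws (common random numbers)
# so the objective is smooth across candidate values of theta.
thetas = np.linspace(0.0, 4.0, 81)
sims = rng.normal(size=500)
scores = [mmd_squared(data, theta + sims) for theta in thetas]
theta_hat = thetas[int(np.argmin(scores))]
```

In practice the grid search would be replaced by gradient-based optimisation over the model parameters, but the estimator is the same: the parameter value minimising the kernel discrepancy between simulated and observed samples.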

Related research

- Kernel Thinning (05/12/2021): We introduce kernel thinning, a new procedure for compressing a distribu...
- Quasi-Monte Carlo Feature Maps for Shift-Invariant Kernels (12/29/2014): We consider the problem of improving the efficiency of randomized Fourie...
- Statistical Inference for Generative Models with Maximum Mean Discrepancy (06/13/2019): While likelihood-based inference and its variants provide a statisticall...
- A Roadmap to Asymptotic Properties with Applications to COVID-19 Data (10/07/2022): Asymptotic properties of statistical estimators play a significant role ...
- Minimum Stein Discrepancy Estimators (06/19/2019): When maximum likelihood estimation is infeasible, one often turns to sco...
- A Stein Goodness of fit Test for Exponential Random Graph Models (02/28/2021): We propose and analyse a novel nonparametric goodness of fit testing pro...
- Kernel Stein Discrepancy on Lie Groups: Theory and Applications (05/21/2023): Distributional approximation is a fundamental problem in machine learnin...
