A Kernelized Stein Discrepancy for Goodness-of-fit Tests and Model Evaluation

02/10/2016 · by Qiang Liu, et al.

We derive a new discrepancy statistic for measuring differences between two probability distributions by combining Stein's identity with reproducing kernel Hilbert space (RKHS) theory. We apply our result to test how well a probabilistic model fits a set of observations, and derive a new class of powerful goodness-of-fit tests that are widely applicable to complex and high-dimensional distributions, even those with computationally intractable normalization constants. Both the theoretical and the empirical properties of our methods are studied thoroughly.
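As a rough illustration of the kind of statistic the abstract describes (a sketch, not the authors' code), the kernelized Stein discrepancy between a sample and a model p can be estimated by a U-statistic over a "Stein kernel" built from a base kernel and the score function ∇x log p(x). The sketch below assumes an RBF kernel with a fixed bandwidth and a user-supplied score function; note that the score depends only on the unnormalized density, which is why intractable normalization constants pose no problem.

```python
import numpy as np

def ksd_u_statistic(samples, score_fn, bandwidth=1.0):
    """U-statistic estimate of a kernelized Stein discrepancy between
    the empirical distribution of `samples` and a model p.

    Uses the RBF kernel k(x, y) = exp(-||x - y||^2 / (2 h^2)).
    `score_fn(x)` must return grad_x log p(x) for a single point x;
    only the unnormalized density of p is needed for this.
    """
    X = np.atleast_2d(np.asarray(samples, dtype=float))
    n, d = X.shape
    h2 = bandwidth ** 2
    S = np.apply_along_axis(score_fn, 1, X)      # (n, d) scores s(x_i)
    diff = X[:, None, :] - X[None, :, :]         # (n, n, d) pairwise x_i - x_j
    sq = np.sum(diff ** 2, axis=-1)              # (n, n) squared distances
    K = np.exp(-sq / (2.0 * h2))                 # RBF kernel matrix

    # Stein kernel u_p(x, y) = s(x)^T k s(y) + s(x)^T grad_y k
    #                        + s(y)^T grad_x k + trace(grad_x grad_y k),
    # assembled term by term for the RBF kernel:
    t1 = (S @ S.T) * K
    t2 = np.einsum('id,ijd->ij', S, diff) / h2 * K    # s(x)^T grad_y k
    t3 = -np.einsum('jd,ijd->ij', S, diff) / h2 * K   # s(y)^T grad_x k
    t4 = (d / h2 - sq / h2 ** 2) * K                  # trace term
    U = t1 + t2 + t3 + t4

    # U-statistic: average over off-diagonal pairs (i != j).
    return (U.sum() - np.trace(U)) / (n * (n - 1))
```

For samples drawn from the model itself the estimate hovers near zero, while samples from a mismatched distribution yield a clearly positive value, which is the basis of the goodness-of-fit test. The bandwidth here is a fixed placeholder; in practice a heuristic such as the median pairwise distance is common.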


Related research:

09/14/2017 · Two-sample Statistics Based on Anisotropic Kernels
The paper introduces a new kernel-based Maximum Mean Discrepancy (MMD) s...

02/17/2020 · A Stein Goodness-of-fit Test for Directional Distributions
In many fields, data appears in the form of direction (unit vector) and ...

02/05/2021 · Active Slices for Sliced Stein Discrepancy
Sliced Stein discrepancy (SSD) and its kernelized variants have demonstr...

05/05/2020 · Measuring the Discrepancy between Conditional Distributions: Methods, Properties and Applications
We propose a simple yet powerful test statistic to quantify the discrepa...

03/29/2018 · Bayesian Goodness of Fit Tests: A Conversation for David Mumford
The problem of making practical, useful goodness of fit tests in the Bay...

10/11/2022 · On RKHS Choices for Assessing Graph Generators via Kernel Stein Statistics
Score-based kernelised Stein discrepancy (KSD) tests have emerged as a p...
