Controlling Moments with Kernel Stein Discrepancies

11/10/2022
by Heishiro Kanagawa et al.

Quantifying how far a candidate distribution deviates from a target is challenging when the target is defined by a density with an intractable normalizing constant. The kernel Stein discrepancy (KSD) was proposed to address this problem and has been applied to tasks including diagnosing approximate MCMC samplers and goodness-of-fit testing for unnormalized statistical models. This article investigates a convergence-control property of the diffusion kernel Stein discrepancy (DKSD), an instance of the KSD proposed by Barp et al. (2019). We extend the result of Gorham and Mackey (2017), which showed that the KSD controls the bounded-Lipschitz metric, to functions of polynomial growth. Specifically, we prove that the DKSD controls the integral probability metric defined by a class of pseudo-Lipschitz functions, a polynomial generalization of Lipschitz functions. We also provide practical sufficient conditions on the reproducing kernel for this property to hold. In particular, we show that, with an appropriate kernel, the DKSD detects non-convergence in moments.
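As a concrete illustration of the quantity being studied (a minimal sketch of the standard Langevin-operator KSD, not the diffusion-matrix construction of Barp et al.), the snippet below estimates the squared KSD with an inverse multiquadric base kernel k(x, y) = (c^2 + ||x - y||^2)^beta, beta in (-1, 0). The helper names imq_stein_kernel and ksd_squared, the standard Gaussian target, and the parameter choices are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def imq_stein_kernel(x, y, score_x, score_y, c=1.0, beta=-0.5):
    """Langevin Stein kernel k_p(x, y) built from the IMQ base kernel
    k(x, y) = (c^2 + ||x - y||^2)^beta, with beta in (-1, 0).
    score_x = grad log p(x), score_y = grad log p(y)."""
    d = x - y
    r2 = np.dot(d, d)
    u = c**2 + r2
    k = u**beta                        # base kernel value
    gx = 2.0 * beta * u**(beta - 1) * d   # grad_x k(x, y)
    gy = -gx                              # grad_y k(x, y)
    # trace of the mixed second derivative, div_x div_y k(x, y)
    div = (-4.0 * beta * (beta - 1) * u**(beta - 2) * r2
           - 2.0 * beta * len(d) * u**(beta - 1))
    return div + score_x @ gy + score_y @ gx + (score_x @ score_y) * k

def ksd_squared(samples, score_fn, **kw):
    """V-statistic estimate of KSD^2: (1/n^2) * sum_ij k_p(x_i, x_j)."""
    n = len(samples)
    scores = [score_fn(x) for x in samples]
    total = sum(imq_stein_kernel(samples[i], samples[j],
                                 scores[i], scores[j], **kw)
                for i in range(n) for j in range(n))
    return total / n**2

# Example: standard Gaussian target, whose score is -x.
rng = np.random.default_rng(0)
good = rng.normal(size=(200, 2))           # samples from the target
bad = rng.normal(loc=2.0, size=(200, 2))   # shifted samples
print(ksd_squared(good, lambda x: -x))     # near zero
print(ksd_squared(bad, lambda x: -x))      # noticeably larger
```

On samples from the target the estimate is near zero, while on the shifted sample it grows; the paper's contribution is to guarantee that, with a suitable kernel, this kind of detection extends from bounded-Lipschitz test functions to functions of polynomial growth, and hence to moments.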
