
Metrizing Weak Convergence with Maximum Mean Discrepancies

by Carl-Johann Simon-Gabriel, et al. (ETH Zurich; Imperial College London)

Theorem 12 of Simon-Gabriel and Schölkopf (JMLR, 2018) seemed to close a 40-year-old quest to characterize maximum mean discrepancies (MMDs) that metrize the weak convergence of probability measures. We prove, however, that the theorem is incorrect and provide a correction. We show that, on a locally compact, non-compact, Hausdorff space, the MMD of a bounded continuous Borel measurable kernel k, whose RKHS functions vanish at infinity, metrizes the weak convergence of probability measures if and only if k is continuous and integrally strictly positive definite (ISPD) over all signed, finite, regular Borel measures. We also show that, contrary to the claim of the aforementioned Theorem 12, there exist both bounded continuous ISPD kernels that do not metrize weak convergence and bounded continuous non-ISPD kernels that do metrize it.
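To make the central object concrete, the following is a minimal sketch of how the MMD between two distributions is estimated from samples, using the bounded continuous Gaussian kernel (a standard example of an ISPD kernel on R^d) and the usual biased V-statistic estimate of MMD². The helper names `gaussian_kernel` and `mmd_squared` are illustrative, not from the paper.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Bounded continuous Gaussian (RBF) kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2)),
    # a classical example of an integrally strictly positive definite kernel on R^d.
    d = x[:, None, :] - y[None, :, :]
    return np.exp(-np.sum(d**2, axis=-1) / (2.0 * sigma**2))

def mmd_squared(X, Y, sigma=1.0):
    # Biased (V-statistic) estimate of MMD^2(P, Q) from samples X ~ P, Y ~ Q:
    # MMD^2 = E[k(x, x')] - 2 E[k(x, y)] + E[k(y, y')].
    Kxx = gaussian_kernel(X, X, sigma)
    Kxy = gaussian_kernel(X, Y, sigma)
    Kyy = gaussian_kernel(Y, Y, sigma)
    return Kxx.mean() - 2.0 * Kxy.mean() + Kyy.mean()

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(500, 1))   # samples from P = N(0, 1)
Y = rng.normal(0.0, 1.0, size=(500, 1))   # fresh samples from the same P
Z = rng.normal(3.0, 1.0, size=(500, 1))   # samples from a shifted distribution

print(mmd_squared(X, Y))  # near 0: same underlying distribution
print(mmd_squared(X, Z))  # clearly positive: distinct distributions
```

The paper's question is precisely when convergence of this quantity to zero is equivalent to weak convergence of the underlying measures, which depends on properties of the kernel beyond what this sample-level sketch shows.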



