Bounding the Error From Reference Set Kernel Maximum Mean Discrepancy

12/11/2018
by Alexander Cloninger, et al.

In this paper, we bound the error induced by using a weighted skeletonization of two data sets for computing a two sample test with kernel maximum mean discrepancy. The error is quantified in terms of the speed at which heat diffuses from those points to the rest of the data, as well as how flat the weights on the reference points are, and gives a non-asymptotic, non-probabilistic bound. The result ties into the problem of the eigenvector triple product, which appears in a number of important problems. The error bound also suggests an optimization scheme for choosing the best set of reference points and weights. The method is tested on several two sample test examples.
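As a rough illustration of the objects the abstract refers to, the sketch below contrasts the standard kernel MMD two sample statistic with a weighted reference-set surrogate, where both samples are summarized by weights on a small shared set of reference points. This is only a minimal sketch under assumed choices (Gaussian kernel, nearest-point weighting), not the estimator, bound, or optimization scheme analyzed in the paper, and all names (gaussian_kernel, mmd2_reference, wX, wY, sigma) are illustrative.

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    # Pairwise squared distances between rows of A and rows of B,
    # mapped through a Gaussian (RBF) kernel.
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2.0 * A @ B.T
    return np.exp(-sq / (2.0 * sigma**2))

def mmd2_full(X, Y, sigma=1.0):
    # Biased (V-statistic) estimate of squared MMD using all pairs of points.
    return (gaussian_kernel(X, X, sigma).mean()
            - 2.0 * gaussian_kernel(X, Y, sigma).mean()
            + gaussian_kernel(Y, Y, sigma).mean())

def mmd2_reference(R, wX, wY, sigma=1.0):
    # Squared MMD between the two weighted empirical measures
    # sum_i wX[i] * delta_{R[i]} and sum_i wY[i] * delta_{R[i]}
    # supported on the shared reference set R.
    K = gaussian_kernel(R, R, sigma)
    d = wX - wY
    return float(d @ K @ d)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(0.0, 1.0, size=(500, 2))
    Y = rng.normal(0.5, 1.0, size=(500, 2))

    # Crude skeleton (an assumed stand-in for a principled choice): a small
    # shared reference set, with each sample's weight on a reference point
    # given by the fraction of its points nearest to that reference point.
    R = rng.normal(0.25, 1.2, size=(30, 2))
    nearest_X = np.argmin(((X[:, None, :] - R[None, :, :])**2).sum(-1), axis=1)
    nearest_Y = np.argmin(((Y[:, None, :] - R[None, :, :])**2).sum(-1), axis=1)
    wX = np.bincount(nearest_X, minlength=len(R)) / len(X)
    wY = np.bincount(nearest_Y, minlength=len(R)) / len(Y)

    print("full MMD^2     :", mmd2_full(X, Y))
    print("reference MMD^2:", mmd2_reference(R, wX, wY))
```

The reference-set statistic only needs a kernel matrix of size |R| x |R| rather than one over all data points, which is the computational motivation; the paper's contribution is bounding the gap between the two quantities in terms of heat diffusion from the reference points and the flatness of the weights.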
