Practical Bounds of Kullback-Leibler Divergence Using Maximum Mean Discrepancy

04/05/2022
by Chong Xiao Wang, et al.

Estimating Kullback-Leibler (KL) divergence from data samples is a challenging task: existing approaches either impose restrictive assumptions about the underlying model or require density estimation steps such as space partitioning. Kernel maximum mean discrepancy (MMD) offers an alternative non-parametric approach to comparing two populations, by finding the maximum difference in means generated by measurable functions in the unit ball of a reproducing kernel Hilbert space (RKHS). Predicated on the universal approximation property of universal kernels, we propose two corresponding classes of functions in an RKHS that bound the KL divergence from below and above, and we derive the RKHS representations of these bounds. This allows us to develop asymptotically consistent estimators for the derived bounds. We evaluate the proposed bounds as mutual information proxies on an image dataset and demonstrate that they stably track variations in mutual information.
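As background for the setup described above, the kernel MMD between distributions P and Q is the supremum of E_P[f(X)] - E_Q[f(Y)] over functions f in the unit ball of the RKHS, and its square admits the closed form E[k(X,X')] + E[k(Y,Y')] - 2E[k(X,Y)]. The sketch below implements the standard unbiased estimator of squared MMD from two samples; it illustrates the non-parametric comparison machinery the paper builds on, not the paper's proposed KL bounds. The Gaussian kernel, the bandwidth value, and the function names are illustrative assumptions rather than choices taken from the paper.

```python
import numpy as np

def rbf_kernel(x, y, bandwidth=1.0):
    """Gaussian (RBF) kernel matrix between the rows of x and the rows of y."""
    sq_dists = (
        np.sum(x**2, axis=1)[:, None]
        + np.sum(y**2, axis=1)[None, :]
        - 2.0 * x @ y.T
    )
    return np.exp(-sq_dists / (2.0 * bandwidth**2))

def mmd2_unbiased(x, y, bandwidth=1.0):
    """Unbiased U-statistic estimate of squared MMD between samples x ~ P and y ~ Q."""
    m, n = len(x), len(y)
    k_xx = rbf_kernel(x, x, bandwidth)
    k_yy = rbf_kernel(y, y, bandwidth)
    k_xy = rbf_kernel(x, y, bandwidth)
    # Drop diagonal terms so the within-sample averages are unbiased.
    term_xx = (k_xx.sum() - np.trace(k_xx)) / (m * (m - 1))
    term_yy = (k_yy.sum() - np.trace(k_yy)) / (n * (n - 1))
    term_xy = 2.0 * k_xy.mean()
    return term_xx + term_yy - term_xy

# Example: two Gaussian samples whose means differ slightly.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(500, 2))
y = rng.normal(0.5, 1.0, size=(500, 2))
print(mmd2_unbiased(x, y, bandwidth=1.0))
```

The estimate grows as the two samples separate and concentrates near zero when they are drawn from the same distribution, which is what makes MMD usable as a building block for sample-based divergence bounds of the kind the paper derives.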


