Reliable Estimation of Kullback-Leibler Divergence by Controlling Discriminator Complexity in the Reproducing Kernel Hilbert Space

02/25/2020
by Sandesh Ghimire, et al.

Several scalable methods for computing the Kullback-Leibler (KL) divergence between two distributions from their samples have been proposed and applied in large-scale machine learning models. However, these methods have been found to be unstable, and the theoretical root cause of the instability remains unclear. In this paper, we study in detail a generative adversarial network (GAN) based approach that uses a neural-network discriminator to estimate KL divergence. We argue that, in this setting, high fluctuations in the estimates are a consequence of not controlling the complexity of the discriminator function space. We provide a theoretical underpinning and a remedy for this problem through the following contributions. First, we construct the discriminator in a Reproducing Kernel Hilbert Space (RKHS). This enables us to leverage sample complexity and mean embedding to theoretically relate the error probability bound of the KL estimates to the complexity of the neural-network discriminator. Based on this theory, we then present a scalable way to control the complexity of the discriminator for consistent estimation of the KL divergence. We support both the proposed theory and the proposed complexity-control method with controlled experiments.
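To make the approach concrete, below is a minimal sketch (not the authors' code) of discriminator-based KL estimation via the Donsker-Varadhan bound, KL(P||Q) >= E_P[T] - log E_Q[exp(T)], with the discriminator T restricted to an RKHS approximated by random Fourier features. The ridge penalty lam * ||w||^2 on the feature weights stands in for controlling the discriminator's RKHS norm; the bandwidth sigma, feature count, penalty strength, step size, and the Gaussian example data are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Samples from P = N(0, 1) and Q = N(1, 1); the true KL(P||Q) is 0.5.
xp = rng.normal(0.0, 1.0, size=(2000, 1))
xq = rng.normal(1.0, 1.0, size=(2000, 1))

# Random Fourier features approximating a Gaussian RBF kernel of bandwidth sigma.
sigma, n_features = 1.0, 256
W = rng.normal(0.0, 1.0 / sigma, size=(1, n_features))
b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)

def feats(x):
    return np.sqrt(2.0 / n_features) * np.cos(x @ W + b)

phi_p, phi_q = feats(xp), feats(xq)

w = np.zeros(n_features)   # discriminator weights: T(x) = feats(x) @ w
lam, lr = 1e-3, 0.5        # RKHS-norm penalty and step size (assumed values)

for _ in range(500):
    tq = phi_q @ w
    # Gradient ascent on E_P[T] - log E_Q[exp(T)] - lam * ||w||^2.
    soft = np.exp(tq - tq.max())
    soft /= soft.sum()     # softmax weights over the Q samples
    grad = phi_p.mean(axis=0) - soft @ phi_q - 2.0 * lam * w
    w += lr * grad

tp, tq = phi_p @ w, phi_q @ w
# Numerically stable log-mean-exp for the Donsker-Varadhan estimate.
kl_hat = tp.mean() - (np.log(np.mean(np.exp(tq - tq.max()))) + tq.max())
print(f"estimated KL ~ {kl_hat:.3f} (true value 0.5)")
```

Because the objective is concave in w and the penalty bounds the (approximate) RKHS norm, the estimate stays stable across runs; removing the penalty (lam = 0) is a quick way to observe the fluctuations the paper attributes to an uncontrolled discriminator class.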


Related research

09/29/2021 · Reliable Estimation of KL Divergence using a Discriminator in Reproducing Kernel Hilbert Space
Estimating Kullback Leibler (KL) divergence from samples of two distribu...

05/02/2019 · Estimating Kullback-Leibler Divergence Using Kernel Machines
Recently, a method called the Mutual Information Neural Estimator (MINE)...

04/05/2022 · Practical Bounds of Kullback-Leibler Divergence Using Maximum Mean Discrepancy
Estimating Kullback Leibler (KL) divergence from data samples is a stren...

02/28/2022 · KL Divergence Estimation with Multi-group Attribution
Estimating the Kullback-Leibler (KL) divergence between two distribution...

07/31/2020 · Deep Direct Likelihood Knockoffs
Predictive modeling often uses black box machine learning methods, such ...

06/16/2021 · KALE Flow: A Relaxed KL Gradient Flow for Probabilities with Disjoint Support
We study the gradient flow for a relaxed approximation to the Kullback-L...

01/17/2021 · Measure-conditional Discriminator with Stationary Optimum for GANs and Statistical Distance Surrogates
We propose a simple but effective modification of the discriminators, na...
