Reliable Estimation of KL Divergence using a Discriminator in Reproducing Kernel Hilbert Space

09/29/2021
by Sandesh Ghimire, et al.

Estimating Kullback-Leibler (KL) divergence from samples of two distributions is essential in many machine learning problems. Variational methods using a neural network discriminator have been proposed to achieve this task in a scalable manner. However, we note that most of these methods suffer from high variance in their estimates and from instability during training. In this paper, we examine this issue from the perspectives of statistical learning theory and function-space complexity to understand why it happens and how to solve it. We argue that these pathologies are caused by a lack of control over the complexity of the neural network discriminator function and can be mitigated by controlling it. To achieve this objective, we 1) present a novel construction of the discriminator in a Reproducing Kernel Hilbert Space (RKHS), 2) theoretically relate the error probability bound of the KL estimates to the complexity of the discriminator in the RKHS, 3) present a scalable way to control the complexity (RKHS norm) of the discriminator for reliable estimation of KL divergence, and 4) prove the consistency of the proposed estimator. In three different applications of KL divergence, namely estimation of KL divergence, estimation of mutual information, and Variational Bayes, we show that by controlling the complexity as developed in the theory, we are able to reduce the variance of the KL estimates and stabilize the training.
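
For concreteness, the sketch below illustrates the general idea described in the abstract, not the paper's exact construction: KL(P||Q) is estimated from samples through the Donsker-Varadhan variational bound, the discriminator is restricted to an approximate RKHS via random Fourier features, and its RKHS norm is kept small through an explicit penalty. All function names and hyperparameter values (rff_features, LAMBDA, LR, kernel bandwidth) are illustrative assumptions.

```python
# Minimal sketch (not the authors' implementation): Donsker-Varadhan estimation of
# KL(P || Q) with a discriminator restricted to an approximate RKHS built from
# random Fourier features, and its norm controlled by an explicit penalty.
import numpy as np

rng = np.random.default_rng(0)

def rff_features(x, omega, b):
    """Random Fourier features approximating a Gaussian-kernel feature map."""
    d = omega.shape[1]
    return np.sqrt(2.0 / d) * np.cos(x @ omega + b)

# Samples from P = N(1, 1) and Q = N(0, 1); the true KL(P || Q) is 0.5.
n, dim, n_feat = 5000, 1, 256
xp = rng.normal(1.0, 1.0, size=(n, dim))
xq = rng.normal(0.0, 1.0, size=(n, dim))

# Shared feature map (unit bandwidth assumed for illustration).
omega = rng.normal(0.0, 1.0, size=(dim, n_feat))
b = rng.uniform(0.0, 2 * np.pi, size=n_feat)
phi_p = rff_features(xp, omega, b)
phi_q = rff_features(xq, omega, b)

# Discriminator f(x) = w . phi(x); ||w|| approximates the RKHS norm of f.
w = np.zeros(n_feat)
LAMBDA, LR = 1e-2, 0.5  # norm penalty and step size (assumed values)

# Gradient ascent on the Donsker-Varadhan objective
#   E_P[f] - log E_Q[exp(f)] - LAMBDA * ||w||^2.
for _ in range(2000):
    fq = phi_q @ w
    eq = np.exp(fq - fq.max())
    weights = eq / eq.sum()  # softmax weights from the log E_Q[exp(f)] term
    grad = phi_p.mean(axis=0) - weights @ phi_q - 2 * LAMBDA * w
    w += LR * grad

kl_estimate = (phi_p @ w).mean() - np.log(np.mean(np.exp(phi_q @ w)))
print(f"DV estimate of KL(P||Q): {kl_estimate:.3f}  (true value 0.5)")
```

In this sketch, increasing LAMBDA shrinks the admissible discriminator class, trading a small amount of bias for lower variance in the estimate; the paper develops the corresponding theory and a scalable procedure for the exact RKHS setting.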

Related research

02/25/2020
Reliable Estimation of Kullback-Leibler Divergence by Controlling Discriminator Complexity in the Reproducing Kernel Hilbert Space
Several scalable methods to compute the Kullback Leibler (KL) divergence...

05/02/2019
Estimating Kullback-Leibler Divergence Using Kernel Machines
Recently, a method called the Mutual Information Neural Estimator (MINE)...

04/05/2022
Practical Bounds of Kullback-Leibler Divergence Using Maximum Mean Discrepancy
Estimating Kullback Leibler (KL) divergence from data samples is a stren...

04/14/2021
Deep Data Density Estimation through Donsker-Varadhan Representation
Estimating the data density is one of the challenging problems in deep l...

11/17/2020
Reducing the Variance of Variational Estimates of Mutual Information by Limiting the Critic's Hypothesis Space to RKHS
Mutual information (MI) is an information-theoretic measure of dependenc...

01/17/2021
Measure-conditional Discriminator with Stationary Optimum for GANs and Statistical Distance Surrogates
We propose a simple but effective modification of the discriminators, na...

12/16/2019
ITENE: Intrinsic Transfer Entropy Neural Estimator
Quantifying the directionality of information flow is instrumental in un...
