Neural Entropic Estimation: A faster path to mutual information estimation

05/30/2019
by Chung Chan, et al.

We point out a limitation of mutual information neural estimation (MINE): the network fails to learn in the initial training phase, leading to slow convergence in the number of training iterations. To solve this problem, we propose a faster method called mutual information neural entropic estimation (MI-NEE). Our solution first generalizes MINE to estimate the entropy with respect to a custom reference distribution. The entropy estimate can then be used to estimate the mutual information. We argue that the seemingly redundant intermediate step of entropy estimation allows one to improve convergence through an appropriate choice of reference distribution. In particular, we show that MI-NEE reduces to MINE in the special case where the reference distribution is the product of the marginal distributions, but faster convergence is possible by choosing the uniform distribution as the reference instead. Compared to the product of marginals, the uniform distribution places more samples in low-density regions and fewer in high-density regions, which appears to produce an overall larger gradient and hence faster convergence.
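The abstract describes MI-NEE only at a high level, so the following is a minimal sketch of the idea as stated above, not the authors' implementation. It relies on the standard decomposition consistent with the description: for reference distributions P'_X and P'_Y, I(X;Y) = D(P_XY || P'_X P'_Y) - D(P_X || P'_X) - D(P_Y || P'_Y), where each KL divergence is estimated by maximizing the Donsker-Varadhan lower bound E_P[f] - log E_P'[exp f] over a neural network f, as in MINE. The critic architecture, toy data, and all hyperparameters below are illustrative assumptions (PyTorch); the data are squashed into (0,1) so that a uniform reference covers the support.

    import math
    import torch
    import torch.nn as nn

    class Critic(nn.Module):
        """Small MLP playing the role of the test function f in the DV bound."""
        def __init__(self, dim, hidden=64):
            super().__init__()
            self.net = nn.Sequential(
                nn.Linear(dim, hidden), nn.ReLU(),
                nn.Linear(hidden, hidden), nn.ReLU(),
                nn.Linear(hidden, 1),
            )

        def forward(self, x):
            return self.net(x).squeeze(-1)

    def dv_bound(f, p_samples, ref_samples):
        """Donsker-Varadhan lower bound on D(P || P'):
        E_P[f] - log E_P'[exp f], with expectations replaced by sample averages."""
        log_mean_exp = torch.logsumexp(f(ref_samples), dim=0) - math.log(len(ref_samples))
        return f(p_samples).mean() - log_mean_exp

    # Toy data (an assumption for illustration): correlated Gaussians squashed
    # into (0,1) by a sigmoid so the uniform reference covers the support.
    torch.manual_seed(0)
    n = 2000
    x_raw = torch.randn(n, 1)
    y_raw = x_raw + 0.5 * torch.randn(n, 1)
    x, y = torch.sigmoid(x_raw), torch.sigmoid(y_raw)
    xy = torch.cat([x, y], dim=1)

    fx, fy, fxy = Critic(1), Critic(1), Critic(2)
    opt = torch.optim.Adam(
        [*fx.parameters(), *fy.parameters(), *fxy.parameters()], lr=1e-3)

    for step in range(3000):
        # Fresh uniform reference samples each iteration; since the reference
        # is known, arbitrarily many such samples could be drawn.
        ux, uy = torch.rand(n, 1), torch.rand(n, 1)
        uxy = torch.cat([torch.rand(n, 1), torch.rand(n, 1)], dim=1)

        dxy = dv_bound(fxy, xy, uxy)   # D(P_XY || P'_X P'_Y)
        dx = dv_bound(fx, x, ux)       # D(P_X  || P'_X)
        dy = dv_bound(fy, y, uy)       # D(P_Y  || P'_Y)

        opt.zero_grad()
        (-(dxy + dx + dy)).backward()  # tighten each DV lower bound
        opt.step()

    # MI estimate assembled from the three divergence estimates.
    mi_hat = (dxy - dx - dy).item()
    print(f"estimated I(X;Y) = {mi_hat:.3f} nats")

On this toy example the ground truth is 0.5 ln 5, approximately 0.80 nats (the sigmoid is invertible, so it does not change the mutual information), which gives a direct sanity check on the printed estimate. Replacing the uniform reference samples with resampled marginals makes the last two divergence terms vanish and recovers a MINE-style estimator, matching the reduction noted in the abstract.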
