On Neural Estimators for Conditional Mutual Information Using Nearest Neighbors Sampling

06/12/2020
by Sina Molavipour, et al.

The estimation of mutual information (MI) or conditional mutual information (CMI) from a set of samples is a long-standing problem. A recent line of work leverages the approximation power of artificial neural networks and has shown improvements over conventional methods. An important challenge in this approach is the need to obtain, from the original dataset, a new set of samples distributed according to a specific product density function; this is particularly difficult when estimating CMI. In this paper, we introduce a new technique, based on k nearest neighbors (k-NN), to perform this resampling, and we derive high-confidence concentration bounds for the resulting sample average. The technique is then employed to train a neural network classifier, from which the CMI is estimated. We propose three estimators based on this technique, prove their consistency, compare them with similar approaches in the literature, and experimentally demonstrate improvements in the accuracy and variance of the CMI estimates.
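
To make the general idea concrete, below is a minimal Python sketch of a classifier-based CMI estimator that uses k-NN resampling; it is illustrative only and is not the paper's exact construction or any of its three estimators. The helper names (knn_resample, estimate_cmi), the use of scikit-learn's NearestNeighbors and MLPClassifier, and all hyperparameters (k, network size, iteration count) are assumptions made for the example; averaging the classifier's log-likelihood ratio over joint samples is one common way to turn such a classifier into a divergence estimate.

# Illustrative sketch (assumed helpers and hyperparameters), not the paper's algorithm.
# x, y, z are 2-D NumPy arrays with one sample per row.
import numpy as np
from sklearn.neighbors import NearestNeighbors
from sklearn.neural_network import MLPClassifier

def knn_resample(x, y, z, k=5):
    """Replace each y_i with the y of a random k-NN of z_i to approximate p(x|z)p(y|z)p(z)."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(z)
    _, idx = nn.kneighbors(z)                       # idx[:, 0] is the point itself
    picks = [np.random.choice(row[1:]) for row in idx]
    return np.hstack([x, y[picks], z])              # approximate "product" samples

def estimate_cmi(x, y, z, k=5):
    """Train a joint-vs-resampled classifier and average its log-likelihood ratio."""
    joint = np.hstack([x, y, z])
    prod = knn_resample(x, y, z, k)
    feats = np.vstack([joint, prod])
    labels = np.r_[np.ones(len(joint)), np.zeros(len(prod))]
    clf = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=500).fit(feats, labels)
    p = np.clip(clf.predict_proba(joint)[:, 1], 1e-6, 1 - 1e-6)
    return np.mean(np.log(p / (1 - p)))             # plug-in estimate of I(X;Y|Z)

Replacing y_i with the y value of a random neighbor of z_i approximately breaks the dependence between X and Y while preserving the conditional marginals given Z, which is the role the resampled set plays in this family of estimators.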


Related research

10/05/2020 - DEMI: Discriminative Estimator of Mutual Information
  Estimating mutual information between continuous random variables is oft...

10/13/2017 - Potential Conditional Mutual Information: Estimators, Properties and Applications
  The conditional mutual information I(X;Y|Z) measures the average informa...

12/06/2019 - Conditional Mutual Information Estimation for Mixed Discrete and Continuous Variables with Nearest Neighbors
  Fields like public health, public policy, and social science often want ...

05/31/2023 - Variational f-Divergence and Derangements for Discriminative Mutual Information Estimation
  The accurate estimation of the mutual information is a crucial task in v...

09/07/2016 - Breaking the Bandwidth Barrier: Geometrical Adaptive Entropy Estimation
  Estimators of information theoretic measures such as entropy and mutual ...

02/24/2017 - Nonparanormal Information Estimation
  We study the problem of using i.i.d. samples from an unknown multivariat...
