Investigation of Alternative Measures for Mutual Information

02/02/2022
by Bulut Kuskonmaz, et al.

Mutual information I(X;Y) is a fundamental quantity in information theory that measures how much information the random variable Y holds about the random variable X. One way to define it is by comparing the joint distribution of X and Y with the product of the marginals through the Kullback-Leibler (KL) divergence: if the two distributions are close, Y leaks almost nothing about X, since the two variables are then close to being independent. In the discrete setting, mutual information has the appealing interpretation of counting how many bits Y reveals about X, and if I(X;Y) = H(X) (the Shannon entropy of X), then X is completely revealed. This interpretation does not carry over to the continuous case, where the mutual information can even be infinite. This motivates substituting other metrics and divergences for the KL divergence in the definition. In this paper, we evaluate several such candidates, namely the KL divergence, the Wasserstein distance, the Jensen-Shannon divergence, and the total variation distance, as alternatives to the mutual information in the continuous case. We deploy different methods to estimate or bound these metrics and divergences and evaluate their performance.
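To make the construction concrete, here is a minimal sketch in Python (not taken from the paper) of the discrete version of the quantities described above: mutual information computed as the KL divergence between the joint distribution P_XY and the product of the marginals P_X P_Y, alongside the Jensen-Shannon divergence and the total variation distance between those same two distributions as bounded alternatives. The toy joint table p_xy is an illustrative assumption.

import numpy as np
from scipy.spatial.distance import jensenshannon

def mutual_information_kl(p_xy):
    """I(X;Y) = KL(P_XY || P_X * P_Y), in bits, for a discrete joint table."""
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of X, shape (n, 1)
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of Y, shape (1, m)
    q = p_x * p_y                           # product of the marginals
    mask = p_xy > 0                         # 0 * log(0/q) = 0 by convention
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / q[mask])))

def js_alternative(p_xy):
    """Jensen-Shannon divergence between P_XY and P_X * P_Y, in bits."""
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    q = (p_x * p_y).ravel()
    # scipy returns the JS *distance* (sqrt of the divergence), so square it.
    return jensenshannon(p_xy.ravel(), q, base=2) ** 2

def tv_alternative(p_xy):
    """Total variation distance between P_XY and P_X * P_Y."""
    p_x = p_xy.sum(axis=1, keepdims=True)
    p_y = p_xy.sum(axis=0, keepdims=True)
    return 0.5 * float(np.abs(p_xy - p_x * p_y).sum())

# Toy joint distribution of (X, Y): two strongly dependent binary variables.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
print(mutual_information_kl(p_xy))  # classic mutual information, ~0.278 bits
print(js_alternative(p_xy))         # bounded alternative, at most 1 bit
print(tv_alternative(p_xy))         # bounded alternative, in [0, 1]

The KL-based quantity is unbounded, while the Jensen-Shannon divergence and the total variation distance remain bounded even in regimes where the classic mutual information blows up. The Wasserstein distance is omitted from this sketch because, beyond one dimension, computing it between a joint distribution and the product of its marginals requires an optimal-transport solver, which is presumably one reason estimation and bounding methods are needed in the continuous case.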


