DEMI: Discriminative Estimator of Mutual Information

10/05/2020
by Ruizhi Liao, et al.

Estimating mutual information between continuous random variables is often intractable and extremely challenging for high-dimensional data. Recent progress has leveraged neural networks to optimize variational lower bounds on mutual information. Although these variational methods show promise for this difficult problem, they have been shown, both theoretically and empirically, to have serious statistical limitations: 1) most of the approaches cannot produce accurate estimates when the underlying mutual information is either very low or very high; 2) the resulting estimators may suffer from high variance. Our approach instead trains a classifier that predicts the probability that a data sample pair is drawn from the joint distribution rather than from the product of its marginal distributions, and uses this probabilistic prediction to estimate mutual information. We show theoretically that our method and other variational approaches are equivalent when they achieve their optimum, yet our approach does not optimize a variational bound. Empirical results demonstrate high accuracy and a favorable bias/variance tradeoff for our approach.
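The classifier-based idea in the abstract can be sketched in a few lines. The snippet below is a generic density-ratio sketch, not the paper's exact DEMI objective: a logistic classifier is trained to distinguish joint pairs (x_i, y_i) from product-of-marginals pairs (x_i, y_shuffled), and the average log odds over joint pairs then approximates the mutual information. The function name, the hand-crafted features, and the optimizer settings are all illustrative assumptions.

```python
import numpy as np

def classifier_mi_estimate(x, y, epochs=1000, lr=0.1, seed=0):
    """Sketch of a classifier-based MI estimate (density-ratio trick).

    Positive examples are joint pairs (x_i, y_i); negative examples are
    (x_i, y_j) with y shuffled, i.e. samples from the product of marginals.
    For an optimal classifier d, log(d / (1 - d)) = log p(x,y)/(p(x)p(y)),
    so averaging the log odds over joint pairs approximates I(X; Y).
    This is NOT the exact DEMI estimator from the paper, only the
    general idea the abstract describes.
    """
    rng = np.random.default_rng(seed)
    n = len(x)
    y_shuf = y[rng.permutation(n)]

    # Hand-crafted features so a *linear* logistic classifier can model the
    # quadratic log-density-ratio of correlated Gaussians (an assumption
    # made for this toy sketch; DEMI uses a neural classifier).
    def phi(a, b):
        return np.column_stack([a, b, a * b, a**2, b**2])

    feats = np.vstack([phi(x, y), phi(x, y_shuf)])
    labels = np.concatenate([np.ones(n), np.zeros(n)])

    # Standardize features and append a bias column.
    feats = (feats - feats.mean(0)) / (feats.std(0) + 1e-8)
    feats = np.column_stack([feats, np.ones(2 * n)])

    # Full-batch gradient descent on the logistic loss.
    w = np.zeros(feats.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-feats @ w))
        w -= lr * feats.T @ (p - labels) / (2 * n)

    # Average log odds over the joint pairs approximates I(X; Y) in nats.
    return float(np.mean(feats[:n] @ w))
```

As a sanity check, for correlated Gaussians with correlation rho the true MI is -0.5 * log(1 - rho^2) nats; with enough samples the estimate for dependent data should be clearly positive, while for independent data it should be close to zero.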


Related research

- 10/14/2019: Understanding the Limitations of Variational Mutual Information Estimators
- 05/31/2023: Variational f-Divergence and Derangements for Discriminative Mutual Information Estimation
- 05/16/2019: On Variational Bounds of Mutual Information
- 06/01/2023: On the Effectiveness of Hybrid Mutual Information Estimation
- 11/17/2020: Reducing the Variance of Variational Estimates of Mutual Information by Limiting the Critic's Hypothesis Space to RKHS
- 06/12/2020: On Neural Estimators for Conditional Mutual Information Using Nearest Neighbors Sampling
- 10/26/2021: Estimating Mutual Information via Geodesic kNN
