
DEMI: Discriminative Estimator of Mutual Information

by Ruizhi Liao et al.

Estimating mutual information between continuous random variables is often intractable and extremely challenging for high-dimensional data. Recent progress has leveraged neural networks to optimize variational lower bounds on mutual information. Although these variational methods show promise for this difficult problem, they have been shown, both theoretically and empirically, to have serious statistical limitations: 1) most of the approaches cannot make accurate estimates when the underlying mutual information is either low or high; 2) the resulting estimators may suffer from high variance. Our approach instead trains a classifier that estimates the probability that a sample pair was drawn from the joint distribution rather than from the product of its marginal distributions, and uses this probabilistic prediction to estimate mutual information. We show theoretically that our method and the variational approaches are equivalent when they achieve their optima, even though our approach does not optimize a variational bound. Empirical results demonstrate high accuracy and a good bias/variance tradeoff using our approach.
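The classifier-based idea can be illustrated with a minimal sketch (not the authors' implementation): with equal class priors, the classifier's odds d/(1-d) estimate the density ratio p(x,y)/(p(x)p(y)), so averaging the log-odds over joint samples estimates the mutual information. The sketch below uses a logistic regression with hand-picked quadratic features, which suffice for bivariate Gaussian data where the true MI is known in closed form; the function and feature choices are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def classifier_mi_estimate(x, y, rng):
    """Estimate I(X;Y) from a classifier that discriminates joint vs. marginal pairs."""
    # Positive class: true pairs from the joint distribution.
    # Negative class: pairs with y shuffled, approximating p(x)p(y).
    y_shuffled = rng.permutation(y)

    def feats(a, b):
        # Quadratic features are sufficient to represent the Gaussian log density ratio.
        return np.column_stack([a * b, a ** 2, b ** 2])

    X = np.vstack([feats(x, y), feats(x, y_shuffled)])
    labels = np.concatenate([np.ones(len(x)), np.zeros(len(x))])
    clf = LogisticRegression(C=1e4, max_iter=1000).fit(X, labels)

    # d = P(joint | x, y); with equal priors, d/(1-d) estimates p(x,y)/(p(x)p(y)).
    d = clf.predict_proba(feats(x, y))[:, 1]
    return np.mean(np.log(d) - np.log1p(-d))

rng = np.random.default_rng(0)
rho, n = 0.8, 50000
xy = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
mi_hat = classifier_mi_estimate(xy[:, 0], xy[:, 1], rng)
true_mi = -0.5 * np.log(1 - rho ** 2)  # closed-form MI for correlated Gaussians, ~0.51 nats
```

For correlated Gaussians the estimate should land close to the closed-form value; with richer data, the logistic regression would be replaced by a neural network classifier as in the paper.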



