Beyond Normal: On the Evaluation of Mutual Information Estimators

06/19/2023
by Paweł Czyż, et al.

Mutual information is a general statistical dependency measure which has found applications in representation learning, causality, domain generalization, and computational biology. However, mutual information estimators are typically evaluated on simple families of probability distributions, namely the multivariate normal distribution and selected distributions with one-dimensional random variables. In this paper, we show how to construct a diverse family of distributions with known ground-truth mutual information and propose a language-independent benchmarking platform for mutual information estimators. We discuss the general applicability and limitations of classical and neural estimators in settings involving high dimensions, sparse interactions, long-tailed distributions, and high mutual information. Finally, we provide guidelines for practitioners on how to select an estimator appropriate to the difficulty of the problem at hand, and on the issues one needs to consider when applying an estimator to a new data set.
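As a minimal illustration of the kind of ground-truth evaluation the abstract describes, the sketch below compares an estimator against a distribution whose mutual information is known in closed form: for a bivariate normal with correlation ρ, I(X;Y) = -½ log(1 - ρ²). The `binned_mi` function is a hypothetical plug-in (histogram) baseline chosen for brevity, not one of the estimators benchmarked in the paper.

```python
import numpy as np

def gaussian_mi(rho):
    # Closed-form MI of a bivariate normal with correlation rho (in nats).
    return -0.5 * np.log(1.0 - rho**2)

def binned_mi(x, y, bins=30):
    # Simple plug-in estimator: discretize, then apply the discrete MI formula.
    # Illustrative baseline only; biased for finite samples and bin counts.
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    nz = pxy > 0  # avoid log(0) on empty bins
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz])))

rng = np.random.default_rng(0)
rho = 0.8
cov = [[1.0, rho], [rho, 1.0]]
xy = rng.multivariate_normal([0.0, 0.0], cov, size=100_000)

truth = gaussian_mi(rho)               # ≈ 0.5108 nats
estimate = binned_mi(xy[:, 0], xy[:, 1])
print(f"ground truth: {truth:.4f}, plug-in estimate: {estimate:.4f}")
```

The same pattern — sample from a distribution with analytically known MI, then measure the estimator's error — underlies the benchmark, except the paper constructs far richer families than the Gaussian (long tails, sparse interactions, high dimensions) where simple estimators like this one break down.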


Related research

- Factorized Mutual Information Maximization (06/13/2019)
- Estimating Total Correlation with Mutual Information Bounds (11/09/2020)
- Nonparanormal Information Estimation (02/24/2017)
- A Robust Estimator of Mutual Information for Deep Learning Interpretability (10/31/2022)
- Estimators for Multivariate Information Measures in General Probability Spaces (10/26/2018)
- Practical and Consistent Estimation of f-Divergences (05/27/2019)
- The Design of Mutual Information (07/10/2019)
