Equitability, mutual information, and the maximal information coefficient

01/31/2013
by Justin B. Kinney, et al.

Reshef et al. recently proposed a new statistical measure, the "maximal information coefficient" (MIC), for quantifying arbitrary dependencies between pairs of stochastic quantities. MIC is based on mutual information, a fundamental quantity in information theory that is widely understood to serve this need. MIC, however, is not an estimate of mutual information. Indeed, it was claimed that MIC possesses a desirable mathematical property called "equitability" that mutual information lacks. This claim was not proven; it was argued solely through the analysis of simulated data. Here we show that this claim is, in fact, incorrect. First, we offer a mathematical proof that no (non-trivial) dependence measure satisfies the definition of equitability proposed by Reshef et al. We then propose a self-consistent and more general definition of equitability that follows naturally from the Data Processing Inequality. Mutual information satisfies this new definition of equitability, while MIC does not. Finally, we show that the simulation evidence offered by Reshef et al. was artifactual. We conclude that estimating mutual information is not only practical for many real-world applications, but also provides a natural solution to the problem of quantifying associations in large data sets.
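For context, the Data Processing Inequality invoked above is the standard information-theoretic statement that post-processing cannot create information. In textbook form (this equation is supplied here for background and is not reproduced from the paper):

\[ X \to Y \to Z \ \text{(Markov chain)} \ \implies \ I(X;Y) \ge I(X;Z). \]

As a concrete illustration of the closing claim that mutual information can be estimated in practice, the following is a minimal sketch of a naive histogram (plug-in) estimator in Python. The function name, the bin count, and the simulated quadratic example are illustrative choices made here; they are not the estimator or the simulations used by the authors.

import numpy as np

def binned_mutual_information(x, y, bins=20):
    """Naive plug-in estimate of I(X;Y) in bits from paired samples."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)   # raw joint counts
    p_xy = joint / joint.sum()                      # empirical joint distribution
    p_x = p_xy.sum(axis=1, keepdims=True)           # marginal of X, shape (bins, 1)
    p_y = p_xy.sum(axis=0, keepdims=True)           # marginal of Y, shape (1, bins)
    nz = p_xy > 0                                   # restrict the sum to occupied cells
    return float(np.sum(p_xy[nz] * np.log2(p_xy[nz] / (p_x * p_y)[nz])))

# Illustrative usage: a noisy quadratic relationship (not a dataset from the paper)
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 10000)
y = x ** 2 + 0.1 * rng.normal(size=x.size)
print(binned_mutual_information(x, y))  # positive, reflecting the nonlinear dependence

Plug-in estimates of this kind are biased at small sample sizes and sensitive to the choice of bins; more careful estimators exist, which is the practical point the abstract makes about real-world applications.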

research · 07/23/2021
On shared and multiple information
We address three outstanding problems in information theory. Problem one...

research · 08/09/2021
A Bayesian Nonparametric Estimation of Mutual Information
Mutual information is a widely-used information theoretic measure to qua...

research · 02/18/2020
The Mathematical Structure of Integrated Information Theory
Integrated Information Theory is one of the leading models of consciousn...

research · 01/27/2013
Equitability Analysis of the Maximal Information Coefficient, with Comparisons
A measure of dependence is said to be equitable if it gives similar scor...

research · 03/20/2017
Copula Index for Detecting Dependence and Monotonicity between Stochastic Signals
This paper introduces a nonparametric copula-based approach for detectin...

research · 07/02/2021
Minimizing couplings in renormalization by preserving short-range mutual information
The connections between renormalization in statistical mechanics and inf...

research · 02/01/2019
A copula-based measure for quantifying asymmetry in dependence and associations
Asymmetry is an inherent property of bivariate associations and therefor...
