Survey On The Estimation Of Mutual Information Methods as a Measure of Dependency Versus Correlation Analysis

by   D. Gencaga, et al.

In this survey, we present and compare different approaches to estimating Mutual Information (MI) from data in order to analyse general dependencies between the variables of interest in a system. We demonstrate the performance difference between MI and correlation analysis, the latter being optimal only in the case of linear dependencies. First, we use a piecewise-constant Bayesian methodology with a general Dirichlet prior. This estimation method follows a two-stage approach: the probability distribution is approximated first, and the marginal and joint entropies are then calculated from it. We demonstrate the performance of this Bayesian approach against other estimators for computing the dependency between different variables, and we also compare these with linear correlation analysis. Finally, we apply MI and correlation analysis to the identification of the bias in the determination of the aerosol optical depth (AOD) by the satellite-based Moderate Resolution Imaging Spectroradiometer (MODIS) and the ground-based AErosol RObotic NETwork (AERONET). Here, we observe that the AOD measurements taken by these two instruments can differ for the same location. The reason for this bias is explored by quantifying the dependencies between the bias and 15 other variables, including cloud cover and surface reflectivity.
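The two-stage scheme described above (approximate the distribution first, then compute the marginal and joint entropies) can be sketched as a simple plug-in estimator. This is a minimal illustration, not the survey's actual implementation: the bin count, the Dirichlet pseudo-count `alpha`, and the synthetic quadratic example are assumptions made for demonstration only.

```python
import numpy as np

def mi_dirichlet(x, y, bins=8, alpha=0.5):
    """Plug-in MI estimate (in nats) from a 2-D histogram, with a
    Dirichlet pseudo-count alpha added to every cell before normalising.
    Stage 1: approximate the joint distribution; stage 2: entropies."""
    counts, _, _ = np.histogram2d(x, y, bins=bins)
    counts = counts + alpha                      # Dirichlet smoothing
    pxy = counts / counts.sum()                  # joint distribution
    px = pxy.sum(axis=1, keepdims=True)          # marginal of x
    py = pxy.sum(axis=0, keepdims=True)          # marginal of y
    # MI = H(X) + H(Y) - H(X,Y), written as a KL divergence
    return float(np.sum(pxy * np.log(pxy / (px * py))))

# Non-linear dependency: y is a function of x, yet nearly uncorrelated with it
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 5000)
y = x**2 + 0.05 * rng.normal(size=5000)

r = np.corrcoef(x, y)[0, 1]   # close to zero: correlation misses the dependency
mi = mi_dirichlet(x, y)       # clearly positive: MI detects it
```

The quadratic example makes the survey's central point concrete: the Pearson coefficient is near zero because the dependency is symmetric about the origin, while the MI estimate is well above zero.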






