Survey On The Estimation Of Mutual Information Methods as a Measure of Dependency Versus Correlation Analysis

01/14/2014
by D. Gencaga, et al.

In this survey, we present and compare different approaches to estimating Mutual Information (MI) from data in order to analyse general dependencies between variables of interest in a system. We demonstrate the performance difference between MI and correlation analysis, the latter being optimal only in the case of linear dependencies. First, we apply a piecewise-constant Bayesian methodology with a general Dirichlet prior. This estimation method follows a two-stage approach: we approximate the probability distribution first and then calculate the marginal and joint entropies. We demonstrate the performance of this Bayesian approach against the other estimators for computing the dependency between different variables, and we also compare these with linear correlation analysis. Finally, we apply MI and correlation analysis to the identification of the bias in the determination of the aerosol optical depth (AOD) by the satellite-based Moderate Resolution Imaging Spectroradiometer (MODIS) and the ground-based AErosol RObotic NETwork (AERONET). We observe that the AOD measurements by these two instruments can differ for the same location. The reason for this bias is explored by quantifying the dependencies between the bias and 15 other variables, including cloud cover and surface reflectivity.
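
To make the two-stage idea concrete, the following is a minimal Python sketch of a plug-in MI estimator: a joint distribution is first estimated from a 2-D histogram smoothed with a symmetric Dirichlet prior (pseudo-count per bin), and MI is then obtained from the marginal and joint entropies as MI = H(X) + H(Y) - H(X,Y). This is only an illustrative simplification of the paper's piecewise-constant Bayesian estimator; the bin count `bins` and concentration `alpha` are assumed values, not choices from the paper.

```python
import numpy as np

def mutual_information(x, y, bins=16, alpha=1.0):
    """Plug-in MI estimate (in nats) from a Dirichlet-smoothed 2-D histogram.

    Stage 1: estimate the joint pmf (posterior mean under a symmetric
    Dirichlet prior with pseudo-count `alpha` per bin).
    Stage 2: combine marginal and joint entropies into
    MI = H(X) + H(Y) - H(X, Y).
    """
    counts, _, _ = np.histogram2d(x, y, bins=bins)
    p_xy = (counts + alpha) / (counts.sum() + alpha * counts.size)
    p_x = p_xy.sum(axis=1)
    p_y = p_xy.sum(axis=0)

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    return entropy(p_x) + entropy(p_y) - entropy(p_xy.ravel())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=20_000)
    y = x**2 + 0.1 * rng.normal(size=20_000)   # nonlinear dependence

    r = np.corrcoef(x, y)[0, 1]
    print(f"Pearson correlation: {r:+.3f}")          # near zero
    print(f"Estimated MI (nats): {mutual_information(x, y):.3f}")  # clearly positive
```

On data with a purely nonlinear dependence such as y = x^2, the correlation coefficient is close to zero while the MI estimate is clearly positive, which is the kind of contrast between MI and correlation analysis that the survey examines.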
