Different coefficients for studying dependence

10/15/2021
by Oona Rainio, et al.

Through computer simulations, we study several different measures of dependence, including Pearson's and Spearman's correlation coefficients, the maximal correlation, the distance correlation, a function of the mutual information called the information coefficient of correlation, and the maximal information coefficient (MIC). We compare how well these coefficients fulfill the criteria of generality, power, and equitability. Furthermore, we consider how the exact type of dependence, the amount of noise, and the number of observations affect their performance.
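For a concrete sense of how these coefficients behave, the sketch below computes four of them on simulated noisy quadratic data: Pearson's and Spearman's coefficients via scipy.stats, the sample distance correlation implemented directly from the Székely–Rizzo double-centering definition, and Linfoot's information coefficient of correlation r_I = sqrt(1 - exp(-2 I(X;Y))) with the mutual information estimated from a 2-D histogram. This is a minimal illustration, not the paper's simulation code; the sample size, noise level, and histogram bin count are arbitrary assumptions.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(0)

def distance_correlation(x, y):
    # Sample distance correlation (Székely-Rizzo): double-center the
    # pairwise distance matrices, then combine their inner products.
    x, y = np.asarray(x, float), np.asarray(y, float)
    a = np.abs(x[:, None] - x[None, :])
    b = np.abs(y[:, None] - y[None, :])
    A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()
    B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
    dcov2 = (A * B).mean()                    # squared distance covariance
    dvar_x, dvar_y = (A * A).mean(), (B * B).mean()
    return np.sqrt(dcov2 / np.sqrt(dvar_x * dvar_y))

def information_coefficient(x, y, bins=16):
    # Linfoot's information coefficient r_I = sqrt(1 - exp(-2 I(X;Y))),
    # with I(X;Y) estimated (in nats) from a 2-D histogram; the bin
    # count is an illustrative choice, not taken from the paper.
    c_xy = np.histogram2d(x, y, bins=bins)[0]
    mi = mutual_info_score(None, None, contingency=c_xy)
    return np.sqrt(1.0 - np.exp(-2.0 * mi))

# Noisy quadratic dependence: symmetric about zero, so the linear and
# rank coefficients should come out near zero.
n = 1000
x = rng.uniform(-1, 1, n)
y = x**2 + 0.1 * rng.standard_normal(n)

print("Pearson:           %.3f" % pearsonr(x, y)[0])
print("Spearman:          %.3f" % spearmanr(x, y)[0])
print("distance corr.:    %.3f" % distance_correlation(x, y))
print("info. coefficient: %.3f" % information_coefficient(x, y))
```

On data like this, Pearson's and Spearman's coefficients stay near zero while the distance correlation and the information coefficient clearly register the dependence, which is the generality criterion from the abstract in action.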


Related research

01/11/2019 · On the Importance of Asymmetry and Monotonicity Constraints in Maximal Correlation Analysis
The maximal correlation coefficient is a well-established generalization...

09/23/2019 · A new coefficient of correlation
Is it possible to define a coefficient of correlation which is (a) as si...

03/11/2019 · Calibrating dependence between random elements
Attempts to quantify dependence between random elements X and Y via maxi...

03/20/2017 · Copula Index for Detecting Dependence and Monotonicity between Stochastic Signals
This paper introduces a nonparametric copula-based approach for detectin...

06/01/2015 · Mutual Dependence: A Novel Method for Computing Dependencies Between Random Variables
In data science, it is often required to estimate dependencies between d...

01/27/2013 · Equitability Analysis of the Maximal Information Coefficient, with Comparisons
A measure of dependence is said to be equitable if it gives similar scor...

08/26/2013 · The Generalized Mean Information Coefficient
Reshef & Reshef recently published a paper in which they present a metho...