Different coefficients for studying dependence

10/15/2021
by Oona Rainio, et al.

Through computer simulations, we study several measures of dependence: Pearson's and Spearman's correlation coefficients, the maximal correlation, the distance correlation, a function of the mutual information called the information coefficient of correlation, and the maximal information coefficient (MIC). We compare how well these coefficients fulfill the criteria of generality, power, and equitability, and consider how the exact type of dependence, the amount of noise, and the number of observations affect their performance.
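As a rough illustration of the kind of comparison the abstract describes (not the paper's actual simulation setup), the sketch below computes three of the listed coefficients on simulated linear and non-monotone data. Pearson's and Spearman's coefficients come from SciPy; `distance_correlation` is a hand-rolled helper implementing the standard sample distance correlation of Székely et al., and all variable names are illustrative. (The information coefficient of correlation mentioned in the abstract is usually Linfoot's r_I = sqrt(1 - exp(-2 I(X;Y))), which additionally requires estimating the mutual information and is omitted here.)

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

def distance_correlation(x, y):
    """Sample distance correlation of two 1-D samples (Székely et al.)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    a = np.abs(x[:, None] - x[None, :])   # pairwise distance matrices
    b = np.abs(y[:, None] - y[None, :])
    # double-centre each distance matrix
    A = a - a.mean(axis=0) - a.mean(axis=1)[:, None] + a.mean()
    B = b - b.mean(axis=0) - b.mean(axis=1)[:, None] + b.mean()
    dcov2 = (A * B).mean()                # squared distance covariance
    dvar_x = (A * A).mean()
    dvar_y = (B * B).mean()
    return np.sqrt(dcov2 / np.sqrt(dvar_x * dvar_y))

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, 500)
noise = rng.normal(0, 0.1, 500)

linear = x + noise        # monotone, linear dependence
parabola = x**2 + noise   # non-monotone dependence

for name, y in [("linear", linear), ("parabola", parabola)]:
    r, _ = pearsonr(x, y)
    rho, _ = spearmanr(x, y)
    print(f"{name}: Pearson={r:.2f}  Spearman={rho:.2f}  "
          f"dCor={distance_correlation(x, y):.2f}")
```

On the linear sample all three coefficients are close to 1, while on the parabola Pearson's and Spearman's coefficients are near 0 and only the distance correlation still detects the dependence, which is the sense in which the latter is more "general".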

Related research

01/11/2019
On the Importance of Asymmetry and Monotonicity Constraints in Maximal Correlation Analysis
The maximal correlation coefficient is a well-established generalization...

09/23/2019
A new coefficient of correlation
Is it possible to define a coefficient of correlation which is (a) as si...

03/11/2019
Calibrating dependence between random elements
Attempts to quantify dependence between random elements X and Y via maxi...

03/20/2017
Copula Index for Detecting Dependence and Monotonicity between Stochastic Signals
This paper introduces a nonparametric copula-based approach for detectin...

01/27/2013
Equitability Analysis of the Maximal Information Coefficient, with Comparisons
A measure of dependence is said to be equitable if it gives similar scor...

08/26/2013
The Generalized Mean Information Coefficient
Reshef & Reshef recently published a paper in which they present a metho...

11/30/2022
Asymmetric Dependence Measurement and Testing
Measuring the (causal) direction and strength of dependence between two ...
