
A new coefficient of correlation
Is it possible to define a coefficient of correlation which is (a) as simple as the classical coefficients like Pearson's correlation or Spearman's correlation, and yet (b) consistently estimates some simple and interpretable measure of the degree of dependence between the variables, which is 0 if and only if the variables are independent and 1 if and only if one is a measurable function of the other, and (c) has a simple asymptotic theory under the hypothesis of independence, like the classical coefficients? This article answers this question in the affirmative, by producing such a coefficient. No assumptions are needed on the distributions of the variables. There are several coefficients in the literature that converge to 0 if and only if the variables are independent, but none that satisfy any of the other properties mentioned above.
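The coefficient described in the abstract is rank-based and assumption-free. As a rough illustration of the idea (a minimal sketch, assuming no ties among the values; the function name and the test data are my own), one sorts the pairs by X, ranks the reordered Y values, and measures how smoothly those ranks vary:

```python
import random

def xi_coefficient(x, y):
    """Rank correlation in the spirit of the abstract (no-ties case).

    Sort the pairs by x, rank the reordered y values, and compute
        xi = 1 - 3 * sum(|r[i+1] - r[i]|) / (n^2 - 1).
    The statistic is near 0 when x and y are independent and
    approaches 1 when y is a measurable function of x.
    """
    n = len(x)
    # y values reordered so that the x values are increasing
    y_sorted = [yv for _, yv in sorted(zip(x, y))]
    # r[i] = rank (1-based) of y_sorted[i] among all y values
    order = sorted(range(n), key=lambda i: y_sorted[i])
    r = [0] * n
    for rank, idx in enumerate(order, start=1):
        r[idx] = rank
    return 1 - 3 * sum(abs(r[i + 1] - r[i]) for i in range(n - 1)) / (n * n - 1)

random.seed(0)
xs = [random.random() for _ in range(2000)]
# y a deterministic function of x: coefficient close to 1
print(xi_coefficient(xs, [x * x for x in xs]))
# y independent of x: coefficient close to 0
print(xi_coefficient(xs, [random.random() for _ in xs]))
```

Note that, unlike Pearson's or Spearman's coefficient, this statistic is not symmetric in X and Y: it asks whether Y is a function of X, not the reverse.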