Equivalence Principle of the P-value and Mutual Information

08/22/2023
by Tsutomu Mori et al.

In this paper, we propose a novel equivalence between probability theory and information theory. For a single random variable, Shannon's self-information, I = -log p, is an alternative expression of a probability p. For two random variables, however, no information measure equivalent to the p-value has been identified. Here, we prove theorems demonstrating that mutual information (MI) is equivalent to the p-value, irrespective of prior information about the distribution of the variables. When the maximum entropy principle applies, our equivalence theorems allow the p-value to be computed readily from multidimensional MI. Conversely, for a contingency table of any size with known marginal frequencies, our theorem states that MI asymptotically coincides with the negative logarithm of the p-value of Fisher's exact test, divided by the sample size. Accordingly, the theorems enable a meta-analysis that accurately estimates MI with a low p-value, thereby quantifying informational interdependence in a way that is robust against variation in sample size. Thus, our theorems demonstrate the equivalence of the p-value and MI in every dimension, exploit the merits of both, and provide a foundation for integrating probability theory and information theory.
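As a rough numerical check of the contingency-table statement, the following Python sketch (not the authors' code; the 2x2 table below is an illustrative assumption) compares the empirical MI of a contingency table with -log(p)/n, where p is the p-value of Fisher's exact test computed with scipy.stats.fisher_exact:

    import numpy as np
    from scipy.stats import fisher_exact

    # Illustrative 2x2 contingency table of observed counts
    table = np.array([[30, 10],
                      [10, 30]])
    n = table.sum()                        # sample size

    joint = table / n                      # empirical joint distribution
    px = joint.sum(axis=1, keepdims=True)  # marginal of the row variable
    py = joint.sum(axis=0, keepdims=True)  # marginal of the column variable

    # Mutual information in nats: sum of p(x,y) * log(p(x,y) / (p(x) p(y)))
    nonzero = joint > 0
    mi = np.sum(joint[nonzero] * np.log(joint[nonzero] / (px @ py)[nonzero]))

    _, p_value = fisher_exact(table)       # two-sided Fisher's exact test
    print("MI        =", mi, "nats")
    print("-log(p)/n =", -np.log(p_value) / n)

For this table the two printed quantities agree to within the asymptotic approximation, and the agreement improves as the sample size grows, consistent with the coincidence the theorem describes.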


Related research

06/15/2019
Non parametric estimation of Joint entropy and Shannon mutual information, Asymptotic limits: Application to statistic tests
This paper proposes a new method for estimating the joint probability ma...

06/10/2019
Formalization of the Axiom of Choice and its Equivalent Theorems
In this paper, we describe the formalization of the axiom of choice and ...

05/12/2020
Strong Asymptotic Composition Theorems for Sibson Mutual Information
We characterize the growth of the Sibson mutual information, of any orde...

06/27/2012
Ranking by Dependence - A Fair Criteria
Estimating the dependences between random variables, and ranking them ac...

02/18/2022
Information Decomposition Diagrams Applied beyond Shannon Entropy: A Generalization of Hu's Theorem
In information theory, one major goal is to find useful functions that s...

02/07/2017
Trimming the Independent Fat: Sufficient Statistics, Mutual Information, and Predictability from Effective Channel States
One of the most fundamental questions one can ask about a pair of random...

01/12/2022
Theoretical Limits of Joint Detection and Estimation for Radar Target
This paper proposes a joint detection and estimation (JDE) scheme based ...
