Nonparametric estimation of joint entropy and Shannon mutual information, asymptotic limits: application to statistical tests

06/15/2019
by Amadou Diadie Ba, et al.

This paper proposes a new method for estimating the joint probability mass function of a pair of discrete random variables. The estimator is then used to construct estimates of the joint entropy and the Shannon mutual information of the pair. Almost sure consistency and central limit theorems are established, and the theoretical results are validated by simulations.
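The paper's specific estimator is not spelled out in this abstract. As a point of reference, the sketch below implements the standard empirical plug-in approach: estimate the joint pmf by relative frequencies of observed pairs, substitute it into the entropy formula, and recover mutual information through the identity I(X; Y) = H(X) + H(Y) - H(X, Y). The function name plug_in_entropy_mi and the Python framing are illustrative assumptions, not the authors' construction.

```python
import numpy as np
from collections import Counter

def plug_in_entropy_mi(x, y):
    """Empirical (plug-in) estimates, in nats, of the joint entropy
    H(X, Y) and the mutual information I(X; Y) from paired samples."""
    n = len(x)
    # Empirical joint pmf: relative frequency of each observed pair.
    p_xy = np.array([c / n for c in Counter(zip(x, y)).values()])
    # Empirical marginal pmfs.
    p_x = np.array([c / n for c in Counter(x).values()])
    p_y = np.array([c / n for c in Counter(y).values()])
    h_xy = -np.sum(p_xy * np.log(p_xy))  # joint entropy H(X, Y)
    h_x = -np.sum(p_x * np.log(p_x))
    h_y = -np.sum(p_y * np.log(p_y))
    # Identity I(X; Y) = H(X) + H(Y) - H(X, Y).
    return h_xy, h_x + h_y - h_xy

# Example: Y depends on X, so I(X; Y) should come out clearly positive.
rng = np.random.default_rng(0)
x = rng.integers(0, 4, size=5000).tolist()
y = [(xi + e) % 4 for xi, e in zip(x, rng.integers(0, 2, size=5000))]
h_xy, mi = plug_in_entropy_mi(x, y)
print(f"H(X,Y) ~ {h_xy:.3f} nats, I(X;Y) ~ {mi:.3f} nats")
```

For a finite alphabet, almost sure consistency of this classical plug-in estimator follows from the strong law of large numbers applied to the empirical frequencies; the paper's contribution concerns a new pmf estimator together with its asymptotic distribution theory.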

research
02/16/2020

Conditional Shannon, Rényi, and Tsallis entropies estimation and asymptotic limits: discrete case

A method of estimating the joint probability mass function of a pair of ...
research
12/27/2018

On mutual information estimation for mixed-pair random variables

We study the mutual information estimation for mixed-pair random variabl...
research
08/28/2021

An axiomatic characterization of mutual information

We characterize mutual information as the unique map on ordered pairs of...
research
09/08/2018

Hybrid Statistical Estimation of Mutual Information and its Application to Information Flow

Analysis of a probabilistic system often requires to learn the joint pro...
research
08/22/2023

Equivalence Principle of the P-value and Mutual Information

In this paper, we propose a novel equivalence between probability theory...
research
07/14/2023

A Poisson Decomposition for Information and the Information-Event Diagram

Information diagram and the I-measure are useful mnemonics where random ...
research
02/18/2022

Information Decomposition Diagrams Applied beyond Shannon Entropy: A Generalization of Hu's Theorem

In information theory, one major goal is to find useful functions that s...
