Conditional Shannon, Rényi, and Tsallis entropies estimation and asymptotic limits: discrete case

02/16/2020
by Ba Amadou Diadie, et al.

A method for estimating the joint probability mass function of a pair of discrete random variables is described. This estimator is then used to construct estimates of the conditional Shannon, Rényi, and Tsallis entropies. Almost sure rates of convergence and asymptotic normality of these estimates are established. The theoretical results are validated by simulations.
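For intuition, the following is a minimal Python sketch of a plug-in approach along the lines the abstract describes: the joint pmf is estimated by empirical frequencies, and conditional entropies are computed from it. All function names and parameters here are illustrative assumptions; the averaging forms used for the conditional Rényi and Tsallis entropies are one common convention among several inequivalent ones in the literature, and may not match the paper's exact definitions or estimator.

import numpy as np

def empirical_joint_pmf(x, y):
    # Empirical (frequency) estimate of the joint pmf of two discrete samples.
    xs, xi = np.unique(x, return_inverse=True)
    ys, yi = np.unique(y, return_inverse=True)
    counts = np.zeros((xs.size, ys.size))
    np.add.at(counts, (xi, yi), 1.0)
    return counts / counts.sum()

def conditional_shannon(pxy):
    # H(Y|X) = H(X,Y) - H(X), in nats.
    px = pxy.sum(axis=1)
    h_xy = -np.sum(pxy[pxy > 0] * np.log(pxy[pxy > 0]))
    h_x = -np.sum(px[px > 0] * np.log(px[px > 0]))
    return h_xy - h_x

def conditional_renyi(pxy, alpha):
    # One common (averaging) definition, for alpha != 1:
    #   sum_x p(x) * (1/(1-alpha)) * log sum_y p(y|x)^alpha
    # Several inequivalent definitions exist; this may not be the paper's.
    px = pxy.sum(axis=1)
    total = 0.0
    for i in range(px.size):
        if px[i] > 0:
            cond = pxy[i] / px[i]
            total += px[i] * np.log(np.sum(cond[cond > 0] ** alpha)) / (1.0 - alpha)
    return total

def conditional_tsallis(pxy, q):
    # Analogous averaging form, for q != 1:
    #   sum_x p(x) * (1 - sum_y p(y|x)^q) / (q - 1)
    px = pxy.sum(axis=1)
    total = 0.0
    for i in range(px.size):
        if px[i] > 0:
            cond = pxy[i] / px[i]
            total += px[i] * (1.0 - np.sum(cond[cond > 0] ** q)) / (q - 1.0)
    return total

# Toy usage: Y is a noisy function of X.
rng = np.random.default_rng(0)
x = rng.integers(0, 3, size=5000)
y = (x + rng.integers(0, 2, size=5000)) % 3
p = empirical_joint_pmf(x, y)
print(conditional_shannon(p), conditional_renyi(p, 2.0), conditional_tsallis(p, 2.0))

The Shannon case uses the chain-rule identity H(Y|X) = H(X,Y) - H(X); the Rényi and Tsallis variants average the per-row entropies of the conditional distributions under p(x), which is only one of the conventions in use.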

Related research:

06/15/2019 · Non parametric estimation of Joint entropy and Shannon mutual information, Asymptotic limits: Application to statistic tests
This paper proposes a new method for estimating the joint probability ma...

04/23/2018 · Statistical Estimation of Conditional Shannon Entropy
The new estimates of the conditional Shannon entropy are introduced in t...

02/21/2022 · Asymptotic properties of the normalized discrete associated-kernel estimator for probability mass function
Discrete kernel smoothing is now gaining importance in nonparametric sta...

05/21/2023 · Multi-scale information content measurement method based on Shannon information
In this paper, we present a new multi-scale information content calculat...

02/10/2016 · Conditional Dependence via Shannon Capacity: Axioms, Estimators and Applications
We conduct an axiomatic study of the problem of estimating the strength ...

07/09/2023 · Copula-like inference for discrete bivariate distributions with rectangular support
After reviewing a large body of literature on the modeling of bivariate ...

08/19/2018 · Non-Asymptotic and Asymptotic Fundamental Limits of Guessing Subject to Distortion
This paper considers the problem of guessing random variables subject to...
