Statistical estimation of the Shannon entropy

01/06/2018
by Alexander Bulinski et al.

The behavior of the Kozachenko–Leonenko estimates for the (differential) Shannon entropy is studied as the number of i.i.d. vector-valued observations tends to infinity. The asymptotic unbiasedness and L^2-consistency of the estimates are established. The conditions employed involve analogues of the Hardy–Littlewood maximal function. It is shown that the results apply, in particular, to the entropy estimation of any nondegenerate Gaussian vector.
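As background, the classical Kozachenko–Leonenko estimator referenced in the abstract uses the distance from each sample to its nearest neighbor. A minimal sketch of the standard 1-nearest-neighbor form of that estimator (not the authors' specific conditions or proofs) is given below; the function name and the brute-force distance computation are illustrative choices, not from the paper.

```python
import numpy as np
from math import gamma, log, pi

def kozachenko_leonenko(x):
    """Classical 1-NN Kozachenko-Leonenko estimate of differential entropy.

    x : (n, d) array of i.i.d. samples from an absolutely continuous law.
    Returns the entropy estimate in nats.
    """
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    # Brute-force pairwise Euclidean distances (fine for moderate n).
    diff = x[:, None, :] - x[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    np.fill_diagonal(dist, np.inf)        # exclude each point's distance to itself
    rho = dist.min(axis=1)                # nearest-neighbor distance for each sample
    v_d = pi ** (d / 2) / gamma(d / 2 + 1)  # volume of the unit ball in R^d
    euler_gamma = 0.5772156649015329
    return d * np.mean(np.log(rho)) + log(v_d) + log(n - 1) + euler_gamma

# Sanity check on a nondegenerate Gaussian, the case highlighted in the abstract:
rng = np.random.default_rng(0)
sample = rng.standard_normal((2000, 1))
est = kozachenko_leonenko(sample)
# True differential entropy of N(0,1) is 0.5*log(2*pi*e) ~ 1.419 nats.
```

For a standard normal the estimate should land close to 0.5·log(2πe) ≈ 1.419 nats, illustrating the consistency property the paper establishes under far more general conditions.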


Related research:

- Statistical estimation of the Kullback-Leibler divergence (06/29/2019): Wide conditions are provided to guarantee asymptotic unbiasedness and L^...
- Shannon and Rényi entropy rates of stationary vector valued Gaussian random processes (07/12/2018): We derive expressions for the Shannon and Rényi entropy rates of station...
- Statistical Estimation of Conditional Shannon Entropy (04/23/2018): The new estimates of the conditional Shannon entropy are introduced in t...
- Consistency Analysis of an Empirical Minimum Error Entropy Algorithm (12/17/2014): In this paper we study the consistency of an empirical minimum error ent...
- A combinatorial interpretation for Tsallis 2-entropy (07/13/2018): While Shannon entropy is related to the growth rate of multinomial coeff...
- NPD Entropy: A Non-Parametric Differential Entropy Rate Estimator (05/24/2021): The estimation of entropy rates for stationary discrete-valued stochasti...
- Improving Compressed Counting (05/09/2012): Compressed Counting (CC) [22] was recently proposed for estimating the a...
