On Generalized Schürmann Entropy Estimators

11/18/2021
by Peter Grassberger, et al.

We present a new class of estimators of Shannon entropy for severely undersampled discrete distributions. It is based on a generalization of an estimator proposed by T. Schürmann, which itself generalizes an estimator proposed by one of us in arXiv:physics/0307138. For a special set of parameters these estimators are completely free of bias and have a finite variance, something that is widely believed to be impossible. We also present detailed numerical tests in which we compare them with other recent estimators and with exact results, and we point out a clash with Bayesian estimators of mutual information.
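The earlier estimator referenced in the abstract (arXiv:physics/0307138) estimates the Shannon entropy from the observed symbol counts n_i as H_hat = ln N - (1/N) * sum_i n_i * G(n_i), where G(n) can be evaluated by a simple recursion. The following is a minimal Python sketch of that estimator only (the generalized Schürmann estimators introduced in this paper are not reproduced here); the function names are illustrative.

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def G(n):
    """G(n) = psi(n) + (-1)^n/2 * [psi((n+1)/2) - psi(n/2)],
    evaluated via the recursion
    G(1) = -gamma - ln 2,  G(2) = 2 - gamma - ln 2,
    G(2k+1) = G(2k),  G(2k+2) = G(2k) + 2/(2k+1)."""
    if n == 1:
        return -EULER_GAMMA - math.log(2.0)
    g = 2.0 - EULER_GAMMA - math.log(2.0)  # G(2)
    m = 2
    while m + 2 <= n:                      # step G(2k) -> G(2k+2)
        g += 2.0 / (m + 1)
        m += 2
    return g                               # G(n) = G(m) for n in {m, m+1}

def entropy_estimate(counts):
    """Entropy estimate in nats from symbol counts n_i:
    H_hat = ln N - (1/N) * sum_i n_i * G(n_i)."""
    counts = [n for n in counts if n > 0]
    N = sum(counts)
    return math.log(N) - sum(n * G(n) for n in counts) / N
```

For a well-sampled distribution the estimate approaches the true entropy, e.g. `entropy_estimate([5000, 5000])` is close to ln 2; the correction terms in G matter precisely in the undersampled regime the paper addresses.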


