
On Generalized Schürmann Entropy Estimators

by Peter Grassberger, et al.

We present a new class of estimators of Shannon entropy for severely undersampled discrete distributions. It is based on a generalization of an estimator proposed by T. Schuermann, which is itself a generalization of an estimator proposed by me in arXiv:physics/0307138. For a special set of parameters, these estimators are completely free of bias and have a finite variance, something which is widely believed to be impossible. We also present detailed numerical tests in which we compare them with other recent estimators and with exact results, and we point out a clash with Bayesian estimators for mutual information.
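For readers unfamiliar with this line of work, the earlier estimator referenced above (Grassberger, arXiv:physics/0307138) replaces the naive plug-in entropy with a digamma-based bias correction. The sketch below is an illustration of that earlier estimator, not of the new class introduced in this paper; the function name `entropy_grassberger` and the helper structure are my own choices for exposition.

```python
import numpy as np
from scipy.special import psi  # digamma function

def entropy_grassberger(counts):
    """Bias-corrected Shannon entropy (in nats) from bin counts n_i,
    following the form H = psi(N) - (1/N) * sum_i n_i * G(n_i),
    with G(n) = psi(n) + (1/2)(-1)^n [psi((n+1)/2) - psi(n/2)]."""
    n = np.asarray([c for c in counts if c > 0], dtype=float)
    N = n.sum()
    sign = np.where(n % 2 == 0, 1.0, -1.0)  # (-1)^n without float-power pitfalls
    G = psi(n) + 0.5 * sign * (psi((n + 1.0) / 2.0) - psi(n / 2.0))
    return psi(N) - np.dot(n, G) / N
```

In the well-sampled limit this agrees with the plug-in estimate; its advantage, and the focus of the paper, is the undersampled regime where the plug-in estimator is strongly biased downward.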



