On Generalized Schürmann Entropy Estimators

11/18/2021
by Peter Grassberger, et al.

We present a new class of estimators of Shannon entropy for severely undersampled discrete distributions. It is based on a generalization of an estimator proposed by T. Schürmann, which is itself a generalization of an estimator proposed by myself in arXiv:physics/0307138. For a special set of parameters, these estimators are completely free of bias and have a finite variance, something that is widely believed to be impossible. We also present detailed numerical tests in which we compare them with other recent estimators and with exact results, and we point out a clash with Bayesian estimators for mutual information.
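
For context, the baseline estimator of arXiv:physics/0307138 that Schürmann generalized is commonly quoted as H_hat = ln N - (1/N) * sum_i n_i * G(n_i), where n_i are the bin counts, N = sum_i n_i, and G(n) = psi(n) + ((-1)^n / 2) * [psi((n+1)/2) - psi(n/2)] with psi the digamma function. Below is a minimal Python sketch of that baseline (not of the new generalized family introduced in this paper), assuming NumPy and SciPy are available; the function name grassberger_entropy is ours.

    import numpy as np
    from scipy.special import digamma

    def grassberger_entropy(counts):
        # Baseline estimator commonly attributed to arXiv:physics/0307138:
        #   H_hat = ln(N) - (1/N) * sum_i n_i * G(n_i)
        #   G(n)  = psi(n) + 0.5 * (-1)**n * (psi((n+1)/2) - psi(n/2))
        # Returns the entropy estimate in nats.
        n = np.array([c for c in counts if c > 0], dtype=float)
        N = n.sum()
        sign = np.where(n % 2 == 0, 1.0, -1.0)  # (-1)**n for integer counts
        G = digamma(n) + 0.5 * sign * (digamma((n + 1.0) / 2.0) - digamma(n / 2.0))
        return np.log(N) - (n * G).sum() / N

    # Example: 8 samples spread over 4 observed bins.
    print(grassberger_entropy([4, 2, 1, 1]))

The Schürmann estimator replaces G(n) by a one-parameter variant, and the class of estimators discussed in this paper generalizes that further; the sketch above only reproduces the special case from which those generalizations start.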

Related research:

05/02/2022 · Asymptotic Normality for Plug-in Estimators of Generalized Shannon's Entropy
Shannon's entropy is one of the building blocks of information theory an...

11/30/2021 · Martingale product estimators for sensitivity analysis in computational statistical physics
We introduce a new class of estimators for the linear response of steady...

09/19/2021 · Unifying Design-based Inference: A New Variance Estimation Principle
This paper presents two novel classes of variance estimators with superi...

10/14/2019 · Understanding the Limitations of Variational Mutual Information Estimators
Variational approaches based on neural networks are showing promise for ...

10/27/2018 · Analysis of KNN Information Estimators for Smooth Distributions
KSG mutual information estimator, which is based on the distances of eac...

10/23/2021 · Generalized Resubstitution for Classification Error Estimation
We propose the family of generalized resubstitution classifier error est...

04/30/2020 · Bias-corrected estimator for intrinsic dimension and differential entropy – a visual multiscale approach
Intrinsic dimension and differential entropy estimators are studied in t...