On the Estimation of Information Measures of Continuous Distributions

02/07/2020
by Georg Pichler, et al.

The estimation of information measures of continuous distributions based on samples is a fundamental problem in statistics and machine learning. In this paper, we analyze estimates of differential entropy in K-dimensional Euclidean space, computed from a finite number of samples, when the probability density function belongs to a predetermined convex family P. First, we show that estimating differential entropy to arbitrary accuracy is infeasible if the differential entropy of densities in P is unbounded, which demonstrates the necessity of additional assumptions. Subsequently, we investigate sufficient conditions that enable confidence bounds for the estimation of differential entropy. In particular, we provide confidence bounds for simple histogram-based estimation of differential entropy from a fixed number of samples, assuming that the probability density function is Lipschitz continuous with known Lipschitz constant and known, bounded support. Our focus is on differential entropy, but we provide examples showing that similar results hold for mutual information and relative entropy as well.
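To make the histogram-based approach concrete, the following is a minimal plug-in sketch in Python, assuming the density is supported on the unit cube [0, 1]^K. The function name, the bin count, and the test data are illustrative assumptions; this is a generic plug-in estimator, not the paper's exact construction or its confidence bounds.

```python
# Minimal sketch of plug-in histogram estimation of differential entropy
# for a density supported on [0, 1]^K. All names and parameters here are
# illustrative assumptions, not the paper's construction.
import numpy as np

def histogram_differential_entropy(samples: np.ndarray, bins_per_dim: int) -> float:
    """Plug-in estimate of h(X) in nats from n samples in [0, 1]^K.

    Partitions [0, 1]^K into equal cubes, estimates cell probabilities by
    empirical frequencies, and returns -sum_i p_i * log(p_i / vol_i),
    the differential entropy of the induced piecewise-constant density.
    """
    n, k = samples.shape
    edges = [np.linspace(0.0, 1.0, bins_per_dim + 1)] * k
    counts, _ = np.histogramdd(samples, bins=edges)
    p = counts.ravel() / n              # empirical cell probabilities
    p = p[p > 0]                        # 0 * log 0 = 0 by convention
    cell_volume = bins_per_dim ** (-k)  # volume of each cube
    return float(-np.sum(p * np.log(p / cell_volume)))

# Example: the uniform density on [0, 1]^2 has differential entropy 0.
rng = np.random.default_rng(0)
x = rng.uniform(size=(100_000, 2))
print(histogram_differential_entropy(x, bins_per_dim=20))  # close to 0
```

The bin width trades bias against variance; in the setting above, a Lipschitz assumption with known constant controls the within-cell approximation error of the piecewise-constant density, which is what makes finite-sample confidence bounds attainable.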


Related research

10/27/2018: Analysis of KNN Information Estimators for Smooth Distributions
KSG mutual information estimator, which is based on the distances of eac...

03/19/2021: Stochastic comparisons, differential entropy and varentropy for distributions induced by probability density functions
Stimulated by the need of describing useful notions related to informati...

12/12/2021: Optimal Partitions for Nonparametric Multivariate Entropy Estimation
Efficient and accurate estimation of multivariate empirical probability ...

05/23/2018: Determining the Number of Samples Required to Estimate Entropy in Natural Sequences
Calculating the Shannon entropy for symbolic sequences has been widely c...

11/19/2014: Unification of field theory and maximum entropy methods for learning probability densities
The need to estimate smooth probability distributions (a.k.a. probabilit...

11/14/2019: Estimating differential entropy using recursive copula splitting
A method for estimating the Shannon differential entropy of multidimensi...

05/27/2019: Practical and Consistent Estimation of f-Divergences
The estimation of an f-divergence between two probability distributions ...
