On the consistency of the Kozachenko-Leonenko entropy estimate

02/25/2021
by Luc Devroye, et al.

We revisit the problem of estimating the differential entropy H(f) of a random vector X in R^d with density f, assuming that H(f) exists and is finite. In this note, we study the consistency of the popular nearest neighbor estimate H_n of Kozachenko and Leonenko. Without any smoothness condition, we show that the estimate is consistent (E{|H_n - H(f)|} → 0 as n → ∞) if and only if E{log(‖X‖ + 1)} < ∞. Furthermore, if X has compact support, then H_n → H(f) almost surely.
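For reference, the classical k = 1 Kozachenko-Leonenko estimate has a simple closed form: H_n = (d/n) Σ_{i=1}^n log ρ_i + log V_d + log(n - 1) + γ, where ρ_i is the distance from X_i to its nearest neighbor in the sample, V_d is the volume of the unit ball in R^d, and γ is the Euler-Mascheroni constant. Below is a minimal sketch of this standard form; the function name kl_entropy and the use of SciPy's cKDTree are illustrative choices, not part of the paper.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gammaln

def kl_entropy(x):
    """Kozachenko-Leonenko (k=1) nearest neighbor estimate of the
    differential entropy H(f), in nats.

    x : (n, d) array of i.i.d. samples drawn from the density f.
    Assumes the samples are distinct (true a.s. for a density f).
    """
    n, d = x.shape
    # Distance from each point to its nearest neighbor; query with
    # k=2 because the closest point to x_i is x_i itself.
    rho = cKDTree(x).query(x, k=2)[0][:, 1]
    # log volume of the unit ball in R^d: V_d = pi^(d/2) / Gamma(d/2 + 1)
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    # H_n = (d/n) * sum_i log rho_i + log V_d + log(n-1) + gamma
    return d * np.mean(np.log(rho)) + log_vd + np.log(n - 1) + np.euler_gamma

# Sanity check: the entropy of a standard Gaussian in R^d is (d/2) log(2*pi*e).
rng = np.random.default_rng(0)
x = rng.standard_normal((10_000, 2))
print(kl_entropy(x), np.log(2 * np.pi * np.e))  # both approximately 2.838
```

On a standard Gaussian sample in R^2, the estimate should land near the true entropy log(2πe) ≈ 2.838, consistent with the L1 convergence stated in the abstract.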


Related research

10/27/2021: Nearest neighbor process: weak convergence and non-asymptotic bound
An empirical measure that results from the nearest neighbors to a given ...

06/05/2016: Finite-Sample Analysis of Fixed-k Nearest Neighbor Density Functional Estimators
We provide finite-sample analysis of a general framework for using k-nea...

11/04/2014: Classification with the nearest neighbor rule in general finite dimensional spaces: necessary and sufficient conditions
Given an n-sample of random vectors (X_i, Y_i)_{1 ≤ i ≤ n} whose joint law ...

11/23/2017: The Nearest Neighbor Information Estimator is Adaptively Near Minimax Rate-Optimal
We analyze the Kozachenko--Leonenko (KL) nearest neighbor estimator for ...

04/23/2018: Statistical Estimation of Conditional Shannon Entropy
The new estimates of the conditional Shannon entropy are introduced in t...

11/19/2014: Unification of field theory and maximum entropy methods for learning probability densities
The need to estimate smooth probability distributions (a.k.a. probabilit...

10/27/2018: Estimating Differential Entropy under Gaussian Convolutions
This paper studies the problem of estimating the differential entropy h(...
