Statistical estimation of the Kullback-Leibler divergence

06/29/2019
by Alexander Bulinski et al.

Broad conditions are provided that guarantee asymptotic unbiasedness and L^2-consistency of the introduced estimates of the Kullback-Leibler divergence between probability measures in R^d having densities with respect to the Lebesgue measure. These estimates are constructed from two independent collections of i.i.d. observations and involve specified k-nearest neighbor statistics. In particular, the established results are valid for estimates of the Kullback-Leibler divergence between any two Gaussian measures in R^d with nondegenerate covariance matrices. As a byproduct, we obtain new statements concerning the Kozachenko-Leonenko estimators of the Shannon differential entropy.
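
The full text spells out the exact conditions and construction; as a rough illustration of the kind of estimator the abstract describes, below is a minimal Python sketch of a k-nearest neighbor estimate of the Kullback-Leibler divergence built from two independent samples, alongside a Kozachenko-Leonenko estimate of the differential entropy. The function names, the use of the classical Wang-Kulkarni-Verdú form of the divergence estimate, and the scipy-based implementation are illustrative assumptions, not the authors' construction.

import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def knn_kl_divergence(x, y, k=1):
    # k-NN estimate of D(P || Q) from samples x ~ P (shape (n, d)) and y ~ Q (shape (m, d)).
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n, d = x.shape
    m = y.shape[0]
    # rho: distance from each x_i to its k-th nearest neighbor among the other x_j
    # (query k + 1 points because the closest hit is x_i itself).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    # nu: distance from each x_i to its k-th nearest neighbor among the y_j.
    nu = cKDTree(y).query(x, k=k)[0]
    if k > 1:
        nu = nu[:, -1]
    # Using the same k in both terms lets the digamma bias corrections cancel.
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))

def kozachenko_leonenko_entropy(x, k=1):
    # Kozachenko-Leonenko k-NN estimate of the Shannon differential entropy of P.
    x = np.asarray(x, dtype=float)
    n, d = x.shape
    rho = cKDTree(x).query(x, k=k + 1)[0][:, -1]
    log_vd = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)  # log volume of the unit d-ball
    return d * np.mean(np.log(rho)) + log_vd + digamma(n) - digamma(k)

# Example: two Gaussian samples in R^2 with identity covariance matrices, where the
# true divergence D(N(0, I) || N((1, 1), I)) = ||mu||^2 / 2 = 1 and the true entropy
# of N(0, I) is 1 + log(2 * pi), about 2.84.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(5000, 2))
y = rng.normal(1.0, 1.0, size=(5000, 2))
print(knn_kl_divergence(x, y, k=5))
print(kozachenko_leonenko_entropy(x, k=5))

The Gaussian example mirrors the abstract's remark that the results cover divergences between nondegenerate Gaussian measures in R^d; both estimates approach the closed-form values as the sample sizes grow.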

Related research

04/23/2018  Statistical Estimation of Conditional Shannon Entropy
The new estimates of the conditional Shannon entropy are introduced in t...

01/06/2018  Statistical estimation of the Shannon entropy
The behavior of the Kozachenko-Leonenko estimates for the (differentia...

02/17/2017  Direct Estimation of Information Divergence Using Nearest Neighbor Ratios
We propose a direct estimation method for Rényi and f-divergence measure...

03/20/2019  Inequalities related to some types of entropies and divergences
The aim of this paper is to discuss new results concerning some kinds of...

05/25/2023  The Representation Jensen-Shannon Divergence
Statistical divergences quantify the difference between probability dist...

04/20/2023  Avoiding methane emission rate underestimates when using the divergence method
Methane is a powerful greenhouse gas, and a primary target for mitigatin...

11/01/2019  Update of a conditional probability by minimal divergence
The present paper investigates the situation that two events which are b...