Geometric k-nearest neighbor estimation of entropy and mutual information

11/02/2017
by   Warren M. Lord, et al.

Like most nonparametric estimators of information functionals involving continuous multidimensional random variables, k-nearest neighbor (kNN) estimators involve an estimate of the probability density functions (pdfs) of the variables. The pdfs are estimated using spheres in an appropriate norm to represent local volumes. We introduce a new class of kNN estimators that we call geometric kNN estimators (g-kNN), which use more complex local volume elements to better model the local geometry of the probability measures. As an example of this class of estimators, we develop a g-kNN estimator of entropy and mutual information based on elliptical volume elements, capturing the local stretching and compression common to a wide range of dynamical system attractors. There is a trade-off between the amount of local data needed to fit a more complicated local volume element and the improvement in the estimate due to the better description of the local geometry. In a series of numerical examples, this g-kNN estimator of mutual information is compared to the Kraskov-Stögbauer-Grassberger (KSG) estimator, where we find that the modelling of the local geometry pays off in terms of better estimates, both when the joint distribution is thinly supported and when sample sizes are small. In particular, the examples suggest that g-kNN estimators may be especially relevant to applications in which the system is large but data size is limited.
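The abstract does not spell out the construction, but the core idea, replacing the kNN ball with a locally fitted ellipsoid, can be sketched in code. The snippet below is a minimal illustration in Python, not the authors' exact estimator: it contrasts the classical Kozachenko-Leonenko estimate, whose local volume element is the Euclidean ball reaching the k-th neighbor, with a hypothetical ellipsoidal variant that shapes the volume element by the covariance of the m nearest neighbors. The function names kl_entropy and gknn_entropy and the neighborhood-size parameter m are illustrative assumptions, not names from the paper.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln


def kl_entropy(x, k=5):
    """Classical Kozachenko-Leonenko kNN entropy estimate (in nats).

    The local volume element is a Euclidean ball whose radius is the
    distance from each sample to its k-th nearest neighbor.
    """
    n, d = x.shape
    tree = cKDTree(x)
    # k + 1 because the query point itself is returned at distance 0.
    r = tree.query(x, k=k + 1)[0][:, -1]
    log_unit_ball = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + log_unit_ball + d * np.mean(np.log(r))


def gknn_entropy(x, k=5, m=20):
    """Hypothetical ellipsoidal (g-kNN-style) variant: the ball is replaced
    by an ellipsoid shaped by the covariance of the m nearest neighbors,
    scaled to reach the k-th neighbor in the local Mahalanobis metric.
    Illustrative sketch only, not the paper's exact construction.
    """
    n, d = x.shape
    tree = cKDTree(x)
    _, idx = tree.query(x, k=m + 1)
    log_unit_ball = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    log_vol = np.empty(n)
    for i in range(n):
        nbrs = x[idx[i, 1:]] - x[i]                  # m neighbors, centered
        cov = nbrs.T @ nbrs / m + 1e-12 * np.eye(d)  # regularized local shape
        L = np.linalg.cholesky(cov)
        w = np.linalg.solve(L, nbrs.T).T             # whitened neighbors
        rho = np.sort(np.linalg.norm(w, axis=1))[k - 1]  # k-th Mahalanobis radius
        # log volume of the ellipsoid {z : z^T cov^{-1} z <= rho^2};
        # det(L) = sqrt(det cov) for the Cholesky factor L.
        log_vol[i] = log_unit_ball + d * np.log(rho) + np.sum(np.log(np.diag(L)))
    return digamma(n) - digamma(k) + np.mean(log_vol)


if __name__ == "__main__":
    # Thinly supported 2-D example: samples near a line, where a ball
    # overstates the local support but a stretched ellipsoid does not.
    rng = np.random.default_rng(0)
    t = rng.normal(size=(2000, 1))
    x = np.hstack([t, t + 0.01 * rng.normal(size=(2000, 1))])
    print("KL ball estimate:     ", kl_entropy(x))
    print("ellipsoidal estimate: ", gknn_entropy(x))
```

A mutual information estimate can then be assembled in the usual way, for example via I(X;Y) = H(X) + H(Y) - H(X,Y) applied to the marginal and joint samples; the joint term is exactly where a thinly supported distribution makes the better local geometry pay off.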


