Analysis of KNN Information Estimators for Smooth Distributions

10/27/2018
by Puning Zhao, et al.

The KSG mutual information estimator, based on the distance from each sample to its k-th nearest neighbor, is widely used to estimate the mutual information between two continuous random variables. Existing work has analyzed the convergence rate of this estimator for random variables whose densities are bounded away from zero on their support. In practice, however, the KSG estimator also performs well for a much broader class of distributions, including not only those with bounded support and densities bounded away from zero, but also those with bounded support and densities approaching zero, and those with unbounded support. In this paper, we analyze the convergence rate of the error of the KSG estimator for smooth distributions, whose density support may be either bounded or unbounded. Since the KSG mutual information estimator can be viewed as an adaptive recombination of Kozachenko-Leonenko (KL) entropy estimators, our analysis also provides a convergence analysis of the KL entropy estimator for a broad class of distributions.
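To make the two estimators named above concrete, here is a minimal brute-force sketch (not the paper's code) of the KL entropy estimator and the KSG mutual information estimator built from k-th nearest-neighbor distances in the max-norm, assuming NumPy and SciPy are available; the function names `kl_entropy` and `ksg_mi` are illustrative, not from the paper.

```python
import numpy as np
from scipy.special import digamma

def kl_entropy(x, k=3):
    """Kozachenko-Leonenko entropy estimator (nats), max-norm, brute force.

    x: (n, d) array of i.i.d. samples from a continuous distribution.
    """
    n, d = x.shape
    # Pairwise max-norm distances; exclude self-distances.
    dist = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=-1)
    np.fill_diagonal(dist, np.inf)
    eps = np.sort(dist, axis=1)[:, k - 1]  # distance to k-th nearest neighbor
    # Unit ball in the max-norm is [-1, 1]^d, so log(volume) = d * log(2).
    return digamma(n) - digamma(k) + d * np.log(2) + d * np.mean(np.log(eps))

def ksg_mi(x, y, k=3):
    """KSG mutual information estimator (Kraskov et al., algorithm 1), in nats.

    x, y: 1-D arrays (or (n, d) arrays) of paired samples.
    """
    n = len(x)
    x = np.asarray(x, float).reshape(n, -1)
    y = np.asarray(y, float).reshape(n, -1)
    dx = np.max(np.abs(x[:, None, :] - x[None, :, :]), axis=-1)
    dy = np.max(np.abs(y[:, None, :] - y[None, :, :]), axis=-1)
    dz = np.maximum(dx, dy)  # max-norm in the joint (x, y) space
    np.fill_diagonal(dz, np.inf)
    eps = np.sort(dz, axis=1)[:, k - 1]  # k-th NN distance in joint space
    np.fill_diagonal(dx, np.inf)
    np.fill_diagonal(dy, np.inf)
    # Count marginal neighbors strictly inside each sample's eps-ball.
    nx = np.sum(dx < eps[:, None], axis=1)
    ny = np.sum(dy < eps[:, None], axis=1)
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))

# Example: correlated Gaussians, true MI = -0.5 * log(1 - rho**2).
rng = np.random.default_rng(0)
n, rho = 2000, 0.9
z = rng.standard_normal(n)
x = z
y = rho * z + np.sqrt(1 - rho**2) * rng.standard_normal(n)
mi_hat = ksg_mi(x, y, k=3)
h_hat = kl_entropy(rng.standard_normal((n, 1)), k=3)
```

The brute-force distance matrices keep the sketch short; a practical implementation would use a KD-tree to find the k-th neighbor distances, which is what makes the estimator usable at larger sample sizes.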

Related research

03/28/2016 · Analysis of k-Nearest Neighbor Distances with Application to Entropy Estimation
Estimating entropy and mutual information consistently is important for ...

03/28/2016 · Exponential Concentration of a Density Functional Estimator
We analyze a plug-in estimator for a large class of integral functionals...

02/07/2020 · On the Estimation of Information Measures of Continuous Distributions
The estimation of information measures of continuous distributions based...

02/25/2021 · Inductive Mutual Information Estimation: A Convex Maximum-Entropy Copula Approach
We propose a novel estimator of the mutual information between two ordin...

08/27/2022 · On the Quadratic Decaying Property of the Information Rate Function
The quadratic decaying property of the information rate function states ...

04/11/2016 · Demystifying Fixed k-Nearest Neighbor Information Estimators
Estimating mutual information from i.i.d. samples drawn from an unknown ...

01/01/2018 · Scalable Hash-Based Estimation of Divergence Measures
We propose a scalable divergence estimation method based on hashing. Con...
