
Analysis of KNN Information Estimators for Smooth Distributions

by Puning Zhao, et al.

The Kraskov–Stögbauer–Grassberger (KSG) mutual information estimator, which is based on the distance from each sample to its k-th nearest neighbor, is widely used to estimate the mutual information between two continuous random variables. Existing work has analyzed the convergence rate of this estimator for random variables whose densities are bounded away from zero on their support. In practice, however, the KSG estimator also performs well for a much broader class of distributions, including not only those with bounded support and densities bounded away from zero, but also those with bounded support and densities approaching zero, and those with unbounded support. In this paper, we analyze the convergence rate of the error of the KSG estimator for smooth distributions, whose densities may have either bounded or unbounded support. Since the KSG mutual information estimator can be viewed as an adaptive recombination of Kozachenko–Leonenko (KL) entropy estimators, our analysis also provides a convergence analysis of the KL entropy estimator for a broad class of distributions.
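To make the objects of the analysis concrete, here is a minimal sketch (not from the paper; function names are mine) of the two estimators the abstract refers to: the Kozachenko–Leonenko entropy estimator and the KSG mutual information estimator of Kraskov et al. (2004), both using k-th nearest-neighbor distances under the max-norm:

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma


def kl_entropy(x, k=3):
    """Kozachenko-Leonenko differential entropy estimate (nats).

    Uses the max-norm, under which a ball of radius eps has volume (2*eps)^d.
    """
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    n, d = x.shape
    # eps[i] = distance from sample i to its k-th nearest neighbor
    # (query with k+1 because the nearest hit is the point itself, at distance 0).
    eps = cKDTree(x).query(x, k=k + 1, p=np.inf)[0][:, -1]
    return digamma(n) - digamma(k) + d * np.log(2.0) + d * np.mean(np.log(eps))


def ksg_mi(x, y, k=3):
    """KSG mutual information estimate (nats), algorithm 1 of Kraskov et al."""
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    y = np.asarray(y, dtype=float).reshape(len(y), -1)
    n = len(x)
    xy = np.hstack([x, y])
    # k-th nearest-neighbor distance in the joint (X, Y) space, max-norm.
    eps = cKDTree(xy).query(xy, k=k + 1, p=np.inf)[0][:, -1]
    tree_x, tree_y = cKDTree(x), cKDTree(y)
    # nx[i], ny[i]: points strictly closer than eps[i] in each marginal space
    # (subtract 1 to exclude the point itself; the small offset enforces strictness).
    nx = np.array([len(tree_x.query_ball_point(x[i], eps[i] - 1e-12, p=np.inf)) - 1
                   for i in range(n)])
    ny = np.array([len(tree_y.query_ball_point(y[i], eps[i] - 1e-12, p=np.inf)) - 1
                   for i in range(n)])
    return digamma(k) + digamma(n) - np.mean(digamma(nx + 1) + digamma(ny + 1))
```

Because the radius eps[i] adapts to the local sample density, the KSG estimator implicitly combines KL-style entropy estimates with a per-sample, data-dependent neighborhood size, which is the "adaptive recombination" the abstract describes.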



