Finite-Sample Analysis of Fixed-k Nearest Neighbor Density Functional Estimators

06/05/2016 · by Shashank Singh, et al.

We provide a finite-sample analysis of a general framework for using k-nearest neighbor statistics to estimate functionals of a nonparametric continuous probability density, including entropies and divergences. Rather than plugging a consistent density estimate (which requires k → ∞ as the sample size n → ∞) into the functional of interest, the estimators we consider fix k and perform a bias correction. This is more efficient computationally and, as we show in certain cases, statistically, leading to faster convergence rates. Our framework unifies several previous estimators, and for most of them ours are the first finite-sample guarantees.
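The canonical member of this family is the Kozachenko-Leonenko differential entropy estimator: fix k, measure each sample's distance to its k-th nearest neighbor, and correct the plug-in bias by using the digamma function ψ(k) in place of log k. Below is a minimal sketch in Python (numpy/scipy assumed); the function name kl_entropy and the default k=3 are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def kl_entropy(x, k=3):
    """Kozachenko-Leonenko fixed-k estimate of differential entropy, in nats.

    x : (n, d) array of i.i.d. samples from an unknown density.
    k : fixed number of neighbors; it does NOT grow with n.
    """
    n, d = x.shape
    tree = cKDTree(x)
    # Distance from each sample to its k-th nearest neighbor. Column 0 of the
    # query result is the point itself (distance 0), so ask for k+1 neighbors
    # and keep the last column.
    eps = tree.query(x, k=k + 1)[0][:, -1]
    # log volume of the unit ball in R^d: c_d = pi^(d/2) / Gamma(d/2 + 1).
    log_c_d = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    # A naive plug-in would use log(k) here; replacing it with digamma(k)
    # is the fixed-k bias correction.
    return digamma(n) - digamma(k) + log_c_d + d * np.mean(np.log(eps))

# Sanity check: for a 2-D standard normal, the true differential entropy is
# (d/2) * log(2*pi*e) ≈ 2.84 nats, so the estimate should land nearby.
rng = np.random.default_rng(0)
print(kl_entropy(rng.standard_normal((2000, 2))))
```

Note that k stays fixed as n grows; the ψ(n) − ψ(k) term supplies the bias correction that the abstract describes, rather than requiring k → ∞ for a consistent density estimate.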


Related research

05/22/2018 · Nearest neighbor density functional estimation based on inverse Laplace transform
A general approach to L_2-consistent estimation of various density funct...

03/09/2010 · Estimation of Rényi Entropy and Mutual Information Based on Generalized Nearest-Neighbor Graphs
We present simple and computationally efficient nonparametric estimators...

05/03/2011 · Pruning nearest neighbor cluster trees
Nearest neighbor (k-NN) graphs are widely used in machine learning and d...

02/25/2021 · On the consistency of the Kozachenko-Leonenko entropy estimate
We revisit the problem of the estimation of the differential entropy H(f...

02/12/2018 · Q-learning with Nearest Neighbors
We consider the problem of model-free reinforcement learning for infinit...

08/25/2018 · DNN: A Two-Scale Distributional Tale of Heterogeneous Treatment Effect Inference
Heterogeneous treatment effects are the center of gravity in many modern...

05/19/2016 · Efficient Nonparametric Smoothness Estimation
Sobolev quantities (norms, inner products, and distances) of probability...
