Nearest neighbor process: weak convergence and non-asymptotic bound

10/27/2021
by François Portier, et al.

An empirical measure formed from the nearest neighbors of a given point - the nearest neighbor measure - is introduced and studied as a central statistical quantity. First, the resulting empirical process is shown to satisfy a uniform central limit theorem under a (local) bracketing entropy condition on the underlying class of functions (reflecting the localizing nature of the nearest neighbor algorithm). Second, a uniform non-asymptotic bound is established under a well-known condition on the uniform entropy numbers, often referred to as the Vapnik-Chervonenkis condition.
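The nearest neighbor measure described above can be sketched numerically: given a sample, a query point, and a neighborhood size k, it is the uniform empirical measure on the k sample points closest to the query, and integrating a function f against it amounts to averaging f over those neighbors. The sketch below is a minimal illustration under that reading; the function names (`nn_neighbors`, `nn_integral`) and the choice of Euclidean distance are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def nn_neighbors(sample, x, k):
    """Return the k rows of `sample` closest to x in Euclidean distance.

    These k points carry the nearest neighbor measure at x
    (illustrative sketch, not the paper's construction verbatim).
    """
    dists = np.linalg.norm(sample - x, axis=1)
    idx = np.argpartition(dists, k - 1)[:k]  # indices of the k smallest distances
    return sample[idx]

def nn_integral(sample, x, k, f):
    """Integrate f against the nearest neighbor measure at x:
    (1/k) * sum of f over the k nearest sample points."""
    neighbors = nn_neighbors(sample, x, k)
    return float(np.mean([f(p) for p in neighbors]))

# Usage: average f(X) = X_1^2 over the 50 nearest neighbors of the origin.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))
val = nn_integral(X, np.zeros(2), k=50, f=lambda p: p[0] ** 2)
```

The empirical process studied in the abstract then tracks, uniformly over a class of functions f, the fluctuation of such localized averages around their population counterparts.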


