Rates of Uniform Consistency for k-NN Regression

07/19/2017
by Heinrich Jiang, et al.

We derive high-probability, finite-sample uniform rates of consistency for k-NN regression that are optimal up to logarithmic factors under mild assumptions. Moreover, we show that k-NN regression automatically adapts to an unknown lower intrinsic dimension. We then apply these rates to establish new results on estimating the level sets and global maxima of a function from noisy observations.
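For a concrete reference point, the estimator the paper analyzes is classical k-NN regression: the prediction at a query point is the average response of its k nearest training points. Below is a minimal NumPy sketch of that estimator on noisy one-dimensional data; the function name, parameter choices, and example are illustrative only and are not taken from the paper.

```python
import numpy as np

def knn_regress(X_train, y_train, X_query, k):
    """Plain k-NN regression: predict at each query point by averaging
    the responses of its k nearest training points (Euclidean distance).

    Illustrative sketch of the classical estimator, not the paper's code.
    """
    X_train = np.asarray(X_train, dtype=float)
    y_train = np.asarray(y_train, dtype=float)
    X_query = np.atleast_2d(np.asarray(X_query, dtype=float))

    # Pairwise distances between each query point and each training point.
    dists = np.linalg.norm(X_query[:, None, :] - X_train[None, :, :], axis=2)
    # Indices of the k nearest training points for each query point.
    nn_idx = np.argpartition(dists, k - 1, axis=1)[:, :k]
    # k-NN estimate: unweighted average of the neighbors' responses.
    return y_train[nn_idx].mean(axis=1)

# Example: noisy observations of f(x) = sin(2*pi*x) on [0, 1].
rng = np.random.default_rng(0)
n = 500
X = rng.uniform(0.0, 1.0, size=(n, 1))
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(n)
# A standard theoretical tuning is k growing polynomially in n;
# for Lipschitz functions in one dimension, k on the order of n^(2/3).
k = int(n ** (2 / 3))
print(knn_regress(X, y, [[0.25], [0.5]], k))  # roughly [1.0, 0.0]
```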


Related research

10/19/2011
k-NN Regression Adapts to Local Intrinsic Dimension
Many nonparametric regressors were recently shown to converge at rates t...

01/13/2021
Multiscale regression on unknown manifolds
We consider the regression problem of estimating functions on ℝ^D but su...

02/14/2019
Classification with unknown class conditional label noise on non-compact feature spaces
We investigate the problem of classification in the presence of unknown ...

04/20/2022
Deep Learning meets Nonparametric Regression: Are Weight-Decayed DNNs Locally Adaptive?
We study the theory of neural network (NN) from the lens of classical no...

02/09/2021
Label Smoothed Embedding Hypothesis for Out-of-Distribution Detection
Detecting out-of-distribution (OOD) examples is critical in many applica...

10/07/2021
Neural Estimation of Statistical Divergences
Statistical divergences (SDs), which quantify the dissimilarity between ...

10/01/2020
Universal consistency and rates of convergence of multiclass prototype algorithms in metric spaces
We study universal consistency and convergence rates of simple nearest-n...
