A Bayes consistent 1-NN classifier

07/01/2014
by Aryeh Kontorovich, et al.

We show that a simple modification of the 1-nearest neighbor classifier yields a strongly Bayes consistent learner. Prior to this work, the only strongly Bayes consistent proximity-based method was the k-nearest neighbor classifier, for k growing appropriately with the sample size. We argue that a margin-regularized 1-NN enjoys considerable statistical and algorithmic advantages over the k-NN classifier, including user-friendly finite-sample error bounds as well as time- and memory-efficient learning and test-point evaluation algorithms with a principled speed-accuracy tradeoff. Encouraging empirical results are reported.
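One way to picture a margin-regularized 1-NN is: condense the training set down to a small prototype set governed by a margin parameter, then run plain 1-NN on the prototypes. Below is a minimal sketch under that reading, assuming Euclidean distance and a greedy gamma-net condensation step; `condense_gamma_net`, `predict_1nn`, and the parameter `gamma` are illustrative names, not the paper's actual algorithm, which selects the subsample more carefully and comes with finite-sample guarantees.

```python
import numpy as np

def condense_gamma_net(X, y, gamma):
    """Greedily keep each point that lies farther than `gamma` from every
    prototype kept so far; kept points carry their training labels.
    (A hypothetical condensation step, not the authors' procedure.)"""
    prototypes, labels = [], []
    for xi, yi in zip(X, y):
        if all(np.linalg.norm(xi - p) > gamma for p in prototypes):
            prototypes.append(xi)
            labels.append(yi)
    return np.array(prototypes), np.array(labels)

def predict_1nn(prototypes, labels, X_test):
    """Plain 1-NN over the condensed prototype set."""
    dists = np.linalg.norm(X_test[:, None, :] - prototypes[None, :, :], axis=2)
    return labels[dists.argmin(axis=1)]

# Toy usage: a larger gamma keeps fewer prototypes, trading accuracy for
# speed and memory, which is one concrete reading of the speed-accuracy knob.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2)) + np.repeat([[0.0, 0.0], [3.0, 3.0]], 100, axis=0)
y = np.repeat([0, 1], 100)
P, L = condense_gamma_net(X, y, gamma=1.0)
print(len(P), "prototypes,", (predict_1nn(P, L, X) == y).mean(), "training accuracy")
```

Since the prototype set is all that must be stored and searched at test time, the margin parameter directly controls both memory footprint and evaluation cost.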

Related research

04/27/2019 - Guarantees on Nearest-Neighbor Condensation heuristics
The problem of nearest-neighbor (NN) condensation aims to reduce the siz...

10/07/2019 - Fast and Bayes-consistent nearest neighbors
Research on nearest-neighbor methods tends to focus somewhat dichotomous...

05/03/2011 - Pruning nearest neighbor cluster trees
Nearest neighbor (k-NN) graphs are widely used in machine learning and d...

10/19/2011 - Is the k-NN classifier in high dimensions affected by the curse of dimensionality?
There is an increasing body of evidence suggesting that exact nearest ne...

06/24/2019 - Universal Bayes consistency in metric spaces
We show that a recently proposed 1-nearest-neighbor-based multiclass lea...

07/01/2010 - Survey of Nearest Neighbor Techniques
The nearest neighbor (NN) technique is very simple, highly efficient and...

11/29/2015 - k-Nearest Neighbour Classification of Datasets with a Family of Distances
The k-nearest neighbour (k-NN) classifier is one of the oldest and most ...
