Speculate-Correct Error Bounds for k-Nearest Neighbor Classifiers

10/09/2014
by Eric Bax, et al.

We introduce the speculate-correct method to derive error bounds for local classifiers. Using it, we show that k-nearest neighbor classifiers, in spite of their famously fractured decision boundaries, have exponential error bounds with O(sqrt((k + ln n) / n)) error bound range for n in-sample examples.
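As a rough illustration (not the paper's derivation), the Python sketch below evaluates the bound-range expression sqrt((k + ln n) / n) for a few values of k and n. The constant factor c is a placeholder assumption; the paper's actual constants depend on the confidence parameter of the bound.

```python
import math


def knn_error_bound_range(k: int, n: int, c: float = 1.0) -> float:
    """Width of the k-NN error bound range, up to a constant factor:
    c * sqrt((k + ln n) / n).

    The constant c is a placeholder for illustration only; it is not
    taken from the paper."""
    return c * math.sqrt((k + math.log(n)) / n)


if __name__ == "__main__":
    # For fixed k, the range shrinks roughly as 1/sqrt(n);
    # for fixed n, it grows as sqrt(k).
    for n in (1_000, 10_000, 100_000):
        for k in (1, 5, 25):
            width = knn_error_bound_range(k, n)
            print(f"n={n:>7}, k={k:>2}: bound range ~ {width:.4f}")
```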


Related research

04/03/2020  Nearest neighbor representations of Boolean functions
10/17/2014  A Hierarchical Multi-Output Nearest Neighbor Model for Multi-Output Dependence Learning
11/25/2022  Doubly robust nearest neighbors in factor models
08/16/2023  Two Phases of Scaling Laws for Nearest Neighbor Classifiers
10/02/2012  Nonparametric Unsupervised Classification
03/31/2015  Improved Error Bounds Based on Worst Likely Assignments
04/09/2020  Multiclass Classification via Class-Weighted Nearest Neighbors
