On Convergence of Nearest Neighbor Classifiers over Feature Transformations

10/15/2020
by Luka Rimanic, et al.

The k-Nearest Neighbors (kNN) classifier is a fundamental non-parametric machine learning algorithm. However, it is well known to suffer from the curse of dimensionality, which is why in practice one often applies a kNN classifier on top of a (pre-trained) feature transformation. From a theoretical perspective, most, if not all, results aimed at understanding the kNN classifier are derived for the raw feature space. This leads to an emerging gap between our theoretical understanding of kNN and its practical applications. In this paper, we take a first step towards bridging this gap. We provide a novel analysis of the convergence rate of a kNN classifier over transformed features. This analysis requires an in-depth understanding of the properties that connect the transformed space and the raw feature space. More precisely, we build our convergence bound upon two key properties of the transformed space: (1) safety – how well the raw posterior can be recovered from the transformed space, and (2) smoothness – how complex this recovery function is. Based on our result, we are able to explain why some (pre-trained) feature transformations are better suited for a kNN classifier than others. We empirically validate that both properties have an impact on kNN convergence using 30 feature transformations and 6 benchmark datasets spanning the vision and text domains.
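The practical setup the abstract describes — running kNN on transformed rather than raw features — can be illustrated with a minimal sketch. The plain-numpy kNN below and the norm-based transformation are illustrative choices, not the paper's method: the synthetic labels depend only on the norm of the input, so mapping each point to its norm (a "safe" and very smooth transformation in the abstract's terminology) reduces a 20-dimensional problem to a 1-dimensional one.

```python
import numpy as np

def knn_predict(X_train, y_train, X_test, k=5):
    """Plain k-NN majority vote under Euclidean distance."""
    # Pairwise squared distances between test and train points.
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(-1)
    idx = np.argsort(d2, axis=1)[:, :k]   # indices of the k nearest neighbors
    votes = y_train[idx]                   # their labels
    # Majority vote per test point.
    return np.array([np.bincount(v).argmax() for v in votes])

# Synthetic example (hypothetical data): the label depends only on the
# norm of x, so a norm-based feature transformation makes the task 1-D.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 20))                            # raw 20-D features
y = (np.linalg.norm(X, axis=1) > np.sqrt(20)).astype(int)
X_tr, y_tr, X_te, y_te = X[:400], y[:400], X[400:], y[400:]

transform = lambda Z: np.linalg.norm(Z, axis=1, keepdims=True)

acc_raw = (knn_predict(X_tr, y_tr, X_te) == y_te).mean()
acc_feat = (knn_predict(transform(X_tr), y_tr, transform(X_te)) == y_te).mean()
print(f"raw-space accuracy: {acc_raw:.2f}, transformed-space accuracy: {acc_feat:.2f}")
```

On data of this kind the transformed space typically yields faster convergence for kNN, which is the phenomenon the paper's safety/smoothness bound is designed to explain.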


