Optimal 1-NN Prototypes for Pathological Geometries

10/31/2020
by Ilia Sucholutsky, et al.

Using prototype methods to reduce the size of training datasets can drastically reduce the computational cost of classification with instance-based learning algorithms like the k-Nearest Neighbour classifier. The number and distribution of prototypes required for the classifier to match its original performance are intimately related to the geometry of the training data. As a result, it is often difficult to find the optimal prototypes for a given dataset, and heuristic algorithms are used instead. However, we consider a particularly challenging setting where commonly used heuristic algorithms fail to find suitable prototypes, and we show that the optimal prototypes can instead be found analytically. We also propose an algorithm for finding nearly optimal prototypes in this setting, and use it to empirically validate the theoretical results.
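To illustrate the basic idea of prototype-based 1-NN classification that the abstract describes, here is a minimal sketch in plain NumPy. The data, the choice of class centroids as prototypes, and all names are illustrative assumptions, not the paper's method; the paper's contribution concerns geometries where simple heuristics like this one fail.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: two well-separated 2-D Gaussian classes.
X0 = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(100, 2))
X1 = rng.normal(loc=[4.0, 4.0], scale=0.5, size=(100, 2))
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)

# A common heuristic: one prototype per class, the class centroid.
# This replaces 200 stored training points with just 2.
prototypes = np.stack([X[y == c].mean(axis=0) for c in (0, 1)])
proto_labels = np.array([0, 1])

def predict_1nn(points, refs, labels):
    """Label each point by the label of its nearest reference point."""
    d = np.linalg.norm(points[:, None, :] - refs[None, :, :], axis=-1)
    return labels[d.argmin(axis=1)]

# On easy geometry, the 2-prototype classifier matches the full
# training set's labels almost perfectly.
pred = predict_1nn(X, prototypes, proto_labels)
accuracy = (pred == y).mean()
print(accuracy)
```

On benign, well-separated data like this, centroid prototypes suffice; the pathological geometries studied in the paper are precisely those where such heuristics break down and many more, carefully placed prototypes are needed.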
