A Fast and Easy Regression Technique for k-NN Classification Without Using Negative Pairs

06/11/2018
by   Yutaro Shigeto, et al.

This paper proposes an inexpensive way to learn an effective dissimilarity function for k-nearest neighbor (k-NN) classification. Unlike Mahalanobis metric learning methods, which map both query (unlabeled) objects and labeled objects to new coordinates with a single transformation, our method learns a transformation of the labeled objects only; query objects are kept in their original coordinates. This method has several advantages over existing distance metric learning methods: (i) In experiments with large document and image datasets, it achieves k-NN classification accuracy better than or at least comparable to that of state-of-the-art metric learning methods. (ii) The transformation can be learned efficiently by solving a standard ridge regression problem. On document and image datasets, training is often more than two orders of magnitude faster than with the fastest metric learning methods tested. This speed-up also reflects the fact that the proposed method eliminates optimization over "negative" object pairs, i.e., pairs of objects whose class labels differ. (iii) The formulation has a theoretical justification in terms of reducing hubness in data.
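The overall recipe described above — move only the labeled points via a closed-form ridge regression, then run ordinary k-NN with queries left untouched — can be sketched as follows. This is a rough illustration, not the paper's exact objective: the choice of regression targets here (each labeled point's class centroid) and all function names are assumptions made for the example.

```python
import numpy as np

def fit_ridge_transform(X_labeled, targets, lam=1.0):
    """Closed-form ridge regression: W = (X^T X + lam*I)^{-1} X^T T.

    Learns a linear map W so that X_labeled @ W approximates `targets`.
    No negative (different-class) pairs are ever formed.
    """
    d = X_labeled.shape[1]
    A = X_labeled.T @ X_labeled + lam * np.eye(d)
    return np.linalg.solve(A, X_labeled.T @ targets)

def class_mean_targets(X_labeled, y_labeled):
    """Hypothetical target choice: pull each labeled point toward its
    class centroid (an assumption for this sketch)."""
    T = np.zeros_like(X_labeled)
    for c in np.unique(y_labeled):
        mask = y_labeled == c
        T[mask] = X_labeled[mask].mean(axis=0)
    return T

def knn_predict(X_query, X_labeled_transformed, y_labeled, k=3):
    """Standard k-NN vote; queries stay in their original coordinates,
    only the labeled set has been transformed."""
    preds = []
    for q in X_query:
        d2 = np.sum((X_labeled_transformed - q) ** 2, axis=1)
        nn = np.argsort(d2)[:k]
        vals, counts = np.unique(y_labeled[nn], return_counts=True)
        preds.append(vals[np.argmax(counts)])
    return np.array(preds)
```

A usage sketch: fit `W` on the labeled set, precompute `X_labeled @ W` once, and classify queries against those transformed points. Because training reduces to one regularized least-squares solve, its cost is independent of the number of object pairs, which is the source of the speed-up the abstract describes.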


