Rethinking Nearest Neighbors for Visual Classification

12/15/2021
by Menglin Jia, et al.

Neural network classifiers have become the de facto choice for current "pre-train then fine-tune" paradigms of visual classification. In this paper, we investigate k-Nearest-Neighbor (k-NN) classifiers, a classical model-free learning method from the pre-deep-learning era, as an augmentation to modern neural-network-based approaches. As a lazy learning method, k-NN simply aggregates the distances between the test image and its top-k neighbors in the training set. We adopt k-NN with pre-trained visual representations produced by either supervised or self-supervised methods in two steps: (1) Leverage k-NN predicted probabilities as indications of easy vs. hard examples during training. (2) Linearly interpolate the k-NN predicted distribution with that of the augmented classifier. Via extensive experiments on a wide range of classification tasks, our study reveals the generality and flexibility of k-NN integration with additional insights: (1) k-NN achieves competitive results, sometimes even outperforming a standard linear classifier. (2) Incorporating k-NN is especially beneficial for tasks where parametric classifiers perform poorly and/or in low-data regimes. We hope these discoveries will encourage people to rethink the role of classical, pre-deep-learning methods in computer vision. Our code is available at: https://github.com/KMnP/nn-revisit.
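The two-step integration described in the abstract can be illustrated with a short NumPy sketch. This is a minimal, hypothetical illustration rather than the authors' released code (see the linked repository for that); the function names, the cosine-similarity weighting, the temperature value, and the interpolation weight `lam` are assumptions made for the example.

```python
# Minimal sketch (not the authors' implementation) of k-NN augmentation:
# step (1) uses the k-NN distribution as an easy-vs-hard signal,
# step (2) linearly interpolates k-NN and classifier predictions.
import numpy as np

def knn_probs(query_feat, train_feats, train_labels, num_classes, k=20, temperature=0.07):
    """Soft k-NN class distribution from cosine similarity to the training set."""
    # Normalize features so the dot product is cosine similarity.
    q = query_feat / np.linalg.norm(query_feat)
    t = train_feats / np.linalg.norm(train_feats, axis=1, keepdims=True)
    sims = t @ q                                  # similarity to every training example
    topk = np.argsort(-sims)[:k]                  # indices of the k nearest neighbors
    weights = np.exp(sims[topk] / temperature)    # similarity-weighted votes
    probs = np.zeros(num_classes)
    for idx, w in zip(topk, weights):
        probs[train_labels[idx]] += w
    return probs / probs.sum()

def example_difficulty(knn_dist, true_label):
    """Step (1), illustrative: low k-NN probability on the true class
    marks a hard example (could be used to reweight the training loss)."""
    return 1.0 - knn_dist[true_label]

def interpolate_predictions(classifier_probs, knn_dist, lam=0.5):
    """Step (2): linear interpolation of the two predicted distributions."""
    return lam * classifier_probs + (1.0 - lam) * knn_dist

# Toy usage with random features standing in for pre-trained representations.
rng = np.random.default_rng(0)
train_feats = rng.normal(size=(100, 64))
train_labels = rng.integers(0, 10, size=100)
query = rng.normal(size=64)
p_knn = knn_probs(query, train_feats, train_labels, num_classes=10)
p_cls = np.full(10, 0.1)                          # placeholder classifier output
print(interpolate_predictions(p_cls, p_knn, lam=0.7))
```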
