Boosting k-NN for categorization of natural scenes

01/08/2010
by Paolo Piro, et al.

The k-nearest neighbors (k-NN) classification rule has proven extremely successful in countless computer vision applications. For example, image categorization often relies on uniform voting among the nearest prototypes in the space of descriptors. In spite of its good properties, the classic k-NN rule suffers from high variance when dealing with sparse prototype datasets in high dimensions. A few techniques have been proposed to improve k-NN classification, relying either on deforming the nearest-neighborhood relationship or on modifying the input space. In this paper, we propose a novel boosting algorithm, called UNN (Universal Nearest Neighbors), which induces leveraged k-NN rules, thus generalizing the classic k-NN rule. We redefine the voting rule as a strong classifier that linearly combines predictions from the k closest prototypes. The weak classifiers are learned by UNN so as to minimize a surrogate risk. A major feature of UNN is its ability to learn which prototypes are the most relevant for a given class, thus allowing for effective data reduction. Experimental results on Ripley's synthetic two-class dataset show that this filtering strategy is able to reject "noisy" prototypes. We also carried out image categorization experiments on a database containing eight classes of natural scenes, showing that our method significantly outperforms classic k-NN classification while enabling a significant reduction of the computational cost by means of data filtering.
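The leveraged voting rule described in the abstract lends itself to a compact sketch. Below is a minimal illustration, assuming the per-prototype leverage coefficients (here called `alphas`) have already been learned; in UNN they are fit by boosting weak per-prototype classifiers to minimize a surrogate risk, a training loop omitted here. All names and signatures are hypothetical, not the authors' implementation.

```python
import numpy as np

def leveraged_knn_predict(x, prototypes, labels, alphas, k, n_classes):
    """Leveraged k-NN vote: linearly combine the class memberships of the
    k closest prototypes, each weighted by its leverage coefficient.

    alphas has shape (n_prototypes, n_classes). In UNN these coefficients
    are learned by boosting against a surrogate risk; here they are simply
    taken as given (illustration only, not the authors' code).
    """
    dists = np.linalg.norm(prototypes - x, axis=1)
    knn = np.argsort(dists)[:k]                  # indices of the k nearest prototypes
    # Membership matrix: y[j, c] = +1 if prototype j belongs to class c, else -1.
    y = -np.ones((len(labels), n_classes))
    y[np.arange(len(labels)), labels] = 1.0
    scores = (alphas[knn] * y[knn]).sum(axis=0)  # one leveraged score per class
    return int(np.argmax(scores))

def filter_prototypes(prototypes, labels, alphas, threshold=1e-3):
    """Data reduction: discard prototypes whose total leverage is negligible,
    i.e. prototypes the boosting step found uninformative or noisy."""
    keep = np.abs(alphas).sum(axis=1) > threshold
    return prototypes[keep], labels[keep], alphas[keep]

# Toy usage: with uniform coefficients the rule reduces to classic k-NN voting.
rng = np.random.default_rng(0)
protos = rng.normal(size=(100, 2))
labels = rng.integers(0, 2, size=100)
alphas = np.ones((100, 2))
print(leveraged_knn_predict(rng.normal(size=2), protos, labels, alphas, k=5, n_classes=2))
```

Setting all coefficients equal recovers the uniform k-NN vote; boosting departs from uniformity by driving the leverage of unreliable prototypes toward zero, which is what makes the filtering step an effective data-reduction device.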


