
Under-bagging Nearest Neighbors for Imbalanced Classification

09/01/2021
by   Hanyuan Hang, et al.

In this paper, we propose an ensemble learning algorithm called under-bagging k-nearest neighbors (under-bagging k-NN) for imbalanced classification problems. On the theoretical side, by developing a new learning theory analysis, we show that with properly chosen parameters, i.e., the number of nearest neighbors k, the expected sub-sample size s, and the number of bagging rounds B, under-bagging k-NN achieves optimal convergence rates with respect to the arithmetic mean (AM) of recalls under mild assumptions. Moreover, we show that with a relatively small B, the expected sub-sample size s can be much smaller than the number of training data n at each bagging round, and the number of nearest neighbors k can be reduced simultaneously, especially when the data are highly imbalanced, which leads to substantially lower time complexity and roughly the same space complexity. On the practical side, we conduct numerical experiments that verify the theoretical results, demonstrating the benefits of the under-bagging technique through the promising AM performance and efficiency of the proposed algorithm.
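The following is a minimal sketch of the under-bagging k-NN idea described above, assuming NumPy and scikit-learn's KNeighborsClassifier; the function names, the per-class under-sampling rule, and the posterior-averaging aggregation are illustrative assumptions rather than the paper's exact procedure. Each of the B rounds under-samples every class down to the minority-class size (so the expected sub-sample size s is roughly the number of classes times the minority count), fits a k-NN classifier on that balanced sub-sample, and the rounds are combined by averaging the estimated class posteriors.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier


def under_bagging_knn_fit(X, y, k=5, B=10, rng=None):
    """Fit B k-NN classifiers, each on a class-balanced under-sample.

    Illustrative sketch: every round draws n_min points per class
    without replacement, where n_min is the minority-class size, so the
    sub-sample size s is about (#classes * n_min) per bagging round.
    """
    rng = np.random.default_rng(rng)
    classes, counts = np.unique(y, return_counts=True)
    n_min = counts.min()
    ensemble = []
    for _ in range(B):
        idx = np.concatenate([
            rng.choice(np.flatnonzero(y == c), size=n_min, replace=False)
            for c in classes
        ])
        clf = KNeighborsClassifier(n_neighbors=k).fit(X[idx], y[idx])
        ensemble.append(clf)
    return classes, ensemble


def under_bagging_knn_predict(classes, ensemble, X):
    """Average the per-round posterior estimates and take the argmax."""
    probs = np.mean([clf.predict_proba(X) for clf in ensemble], axis=0)
    return classes[np.argmax(probs, axis=1)]
```

The AM of recalls used as the performance measure corresponds to macro-averaged recall, which scikit-learn exposes as balanced_accuracy_score.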


Related research

Bagged k-Distance for Mode-Based Clustering Using the Probability of Localized Level Sets (10/18/2022)
Distributed Adaptive Nearest Neighbor Classifier: Algorithm and Theory (05/20/2021)
A new hashing based nearest neighbors selection technique for big datasets (04/05/2020)
Predictive Power of Nearest Neighbors Algorithm under Random Perturbation (02/13/2020)
k-nearest neighbors prediction and classification for spatial data (06/01/2018)
k-Nearest Neighbors by Means of Sequence to Sequence Deep Neural Networks and Memory Networks (04/27/2018)