Predictive Power of Nearest Neighbors Algorithm under Random Perturbation

02/13/2020
by Yue Xing, et al.

We consider a data corruption scenario for the classical k-Nearest Neighbors (k-NN) algorithm in which the testing data are randomly perturbed. Under this scenario, we carefully characterize the impact of the corruption level on the asymptotic regret. In particular, our theoretical analysis reveals a phase transition phenomenon: when the corruption level ω is below a critical order (the small-ω regime), the asymptotic regret remains unchanged; when it exceeds that order (the large-ω regime), the asymptotic regret deteriorates polynomially. Surprisingly, we obtain a negative result that the classical noise-injection approach does not improve testing performance in the early stage of the large-ω regime, even at the level of the multiplicative constant of the asymptotic regret. As a technical by-product, we prove that under different model assumptions, the pre-processed 1-NN proposed in <cit.> achieves at most a sub-optimal rate when the data dimension d > 4, even if k is chosen optimally in the pre-processing step.
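To make the corruption scenario concrete, the following is a minimal sketch (not the paper's experiment or theory) comparing plain k-NN with a noise-injection variant when the testing data are randomly perturbed at level ω. The Gaussian mixture data, the dimension d, the choice of k, and the Gaussian perturbation model are all illustrative assumptions.

```python
# Minimal sketch: k-NN accuracy when test points are randomly perturbed
# at level omega, with and without noise injection into the training data.
# All data-generating settings below are illustrative, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

def knn_predict(X_train, y_train, X_test, k):
    """Plain k-NN majority vote with Euclidean distance."""
    preds = np.empty(len(X_test), dtype=int)
    for i, x in enumerate(X_test):
        dist = np.linalg.norm(X_train - x, axis=1)
        nearest = y_train[np.argsort(dist)[:k]]
        preds[i] = np.bincount(nearest).argmax()
    return preds

# Toy two-class Gaussian mixture in d dimensions (class mean shifted by its label).
d, n_train, n_test, k = 5, 2000, 500, 25
y_train = rng.integers(0, 2, n_train)
X_train = rng.normal(size=(n_train, d)) + y_train[:, None]
y_test = rng.integers(0, 2, n_test)
X_test = rng.normal(size=(n_test, d)) + y_test[:, None]

for omega in [0.0, 0.1, 0.5, 1.0]:
    # Random perturbation of the *testing* data at corruption level omega.
    X_test_pert = X_test + omega * rng.normal(size=X_test.shape)

    # Vanilla k-NN fit on clean training data.
    acc_plain = np.mean(knn_predict(X_train, y_train, X_test_pert, k) == y_test)

    # Noise injection: perturb the training data at the same scale before fitting.
    X_train_noisy = X_train + omega * rng.normal(size=X_train.shape)
    acc_inject = np.mean(knn_predict(X_train_noisy, y_train, X_test_pert, k) == y_test)

    print(f"omega={omega:4.2f}  plain={acc_plain:.3f}  noise-injected={acc_inject:.3f}")
```

In this toy setting, small ω barely changes accuracy while larger ω degrades it, and noise injection gives little to no improvement, which is consistent in spirit with the regimes described in the abstract; the sketch is only an illustration, not evidence for the theoretical rates.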
