Extrapolation Towards Imaginary 0-Nearest Neighbour and Its Improved Convergence Rate

02/08/2020
by Akifumi Okuno, et al.

k-nearest neighbour (k-NN) is one of the simplest and most widely used methods for supervised classification; it predicts a query's label by taking a weighted ratio of the observed labels of the k objects nearest to the query. The weights and the parameter k ∈ N regulate its bias-variance trade-off, which in turn affects the convergence rate of the excess risk of the k-NN classifier; several existing studies have considered selecting the optimal k and weights to obtain a faster convergence rate. Whereas k-NN with non-negative weights has been developed widely, it has been proved that negative weights are essential for eradicating the bias terms and attaining the optimal convergence rate. However, computing the optimal weights requires solving entangled equations, so simpler approaches that can find optimal real-valued weights are appreciated in practice. In this paper, we propose multiscale k-NN (MS-k-NN), which extrapolates unweighted k-NN estimators from several k > 1 values to k = 0, thus giving an imaginary 0-NN estimator. MS-k-NN implicitly corresponds to an adaptive method for finding favorable real-valued weights, and we theoretically prove that MS-k-NN attains an improved rate that coincides with the existing optimal rate under some conditions.
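The multiscale idea is simple enough to sketch in code. The following is a minimal, hypothetical Python illustration: compute unweighted k-NN class-probability estimates at several values of k, regress them on powers of (k/n)^(2/d) (following the bias-expansion intuition in the abstract), and read off the intercept as the extrapolated "0-NN" estimate. The function name ms_knn_predict, the default k values, and the choice of regressors are illustrative assumptions, not the paper's exact design.

```python
import numpy as np
from scipy.spatial import cKDTree

def ms_knn_predict(X_train, y_train, query, ks=(5, 10, 20, 40), degree=2):
    """Hypothetical multiscale k-NN sketch for binary labels in {0, 1}.

    Extrapolates unweighted k-NN estimates taken at several k down to an
    imaginary k = 0 via a least-squares polynomial fit; the paper's exact
    regression design may differ.
    """
    n, d = X_train.shape
    tree = cKDTree(X_train)

    # Unweighted k-NN estimate (fraction of positive labels) for each k.
    estimates = []
    for k in ks:
        _, idx = tree.query(query, k=k)
        estimates.append(y_train[np.atleast_1d(idx)].mean())

    # The bias of unweighted k-NN is expected to grow in powers of
    # (k/n)^(2/d), so regress the estimates on those powers and keep
    # the intercept as the extrapolated 0-NN value.
    t = (np.asarray(ks) / n) ** (2.0 / d)
    V = np.vander(t, N=degree + 1, increasing=True)  # columns: 1, t, t^2, ...
    coef, *_ = np.linalg.lstsq(V, np.asarray(estimates), rcond=None)

    p0 = coef[0]                 # intercept = imaginary 0-NN estimate
    return int(p0 >= 0.5), p0    # predicted label and extrapolated score
```

Because the intercept of the least-squares fit is a linear combination of the per-k averages, this procedure implicitly assigns real-valued weights, possibly negative, to the neighbours, which is how the extrapolation connects to the weighted k-NN theory discussed above.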

