Universal consistency and rates of convergence of multiclass prototype algorithms in metric spaces

10/01/2020
by László Györfi, et al.

We study universal consistency and convergence rates of simple nearest-neighbor prototype rules for the problem of multiclass classification in metric spaces. We first show that a novel data-dependent partitioning rule, named Proto-NN, is universally consistent in any metric space that admits a universally consistent rule. Proto-NN is a significant simplification of OptiNet, a recently proposed compression-based algorithm that, to date, was the only algorithm known to be universally consistent in such a general setting. In practice, Proto-NN is simpler to implement and enjoys reduced computational complexity. We then proceed to study convergence rates of the excess error probability. We first obtain rates for the standard k-NN rule under a margin condition and a new generalized-Lipschitz condition. The latter is an extension of a recently proposed modified-Lipschitz condition from ℝ^d to metric spaces. Like the modified-Lipschitz condition, the new condition avoids any boundedness assumptions on the data distribution. While obtaining rates for Proto-NN is left open, we show that a second prototype rule that hybridizes between k-NN and Proto-NN achieves the same rates as k-NN while enjoying similar computational advantages to Proto-NN. We conjecture, however, that like k-NN this hybrid rule is not consistent in general.

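For readers less familiar with the baseline discussed throughout the abstract, the following is a minimal sketch of the standard k-NN multiclass rule under an arbitrary, user-supplied metric: a query point is labeled by majority vote among its k closest training points. This is generic background code, not the paper's Proto-NN or hybrid rule, and all names in it are illustrative.

```python
# Minimal sketch of the standard k-NN multiclass rule in a metric space.
# Not the paper's Proto-NN algorithm; names and the toy metric are illustrative.
from collections import Counter

def knn_classify(query, train_points, train_labels, k, metric):
    """Predict a label for `query` by majority vote over its k nearest
    training points under `metric` (any symmetric distance function)."""
    # Sort training indices by distance from the query point.
    order = sorted(range(len(train_points)),
                   key=lambda i: metric(query, train_points[i]))
    # Majority vote among the k nearest neighbors (ties broken arbitrarily).
    votes = Counter(train_labels[i] for i in order[:k])
    return votes.most_common(1)[0][0]

if __name__ == "__main__":
    # Toy example on the real line with the absolute-value metric.
    X = [0.1, 0.2, 0.9, 1.1, 1.3]
    y = ["a", "a", "b", "b", "b"]
    print(knn_classify(0.15, X, y, k=3, metric=lambda u, v: abs(u - v)))  # -> "a"
```

Prototype rules such as Proto-NN differ from this baseline by first compressing the training sample to a smaller set of labeled prototypes and then classifying by nearest prototype, which is the source of the computational advantages mentioned above.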

