On Error and Compression Rates for Prototype Rules

06/16/2022
by Omer Kerem, et al.

We study the close interplay between error and compression in the non-parametric multiclass classification setting, in terms of prototype learning rules. We focus in particular on a close variant of a recently proposed compression-based learning rule termed OptiNet. Beyond its computational merits, this rule was recently shown to be universally consistent in any metric instance space that admits a universally consistent rule, making it the first learning algorithm known to enjoy this property. However, its error and compression rates have been left open. Here we derive such rates for the case where instances reside in Euclidean space, under commonly posed smoothness and tail conditions on the data distribution. We first show that OptiNet achieves non-trivial compression rates while enjoying near minimax-optimal error rates. We then study a novel general compression scheme for further compressing prototype rules, one that locally adapts to the noise level without sacrificing accuracy. Applying it to OptiNet, we show that under a geometric margin condition, a further gain in the compression rate is achieved. Experimental results comparing the performance of the various methods are presented.
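To make the notion of a compressed prototype rule concrete, here is a minimal sketch of a classical greedy condensing scheme (in the style of Hart's condensed nearest neighbor). It is an illustration of the prototype/compression idea only, not the paper's OptiNet algorithm: it keeps just enough training points as prototypes so that a 1-NN rule over the kept set classifies every training point correctly, and then classifies new points by their nearest prototype.

```python
import numpy as np

def condense(X, y):
    """Greedy condensing (Hart-style, illustrative only -- not OptiNet):
    keep prototypes until the 1-NN rule over the kept set classifies
    every training point correctly."""
    proto_idx = [0]  # seed the prototype set with the first point
    changed = True
    while changed:
        changed = False
        for i in range(len(X)):
            P = X[proto_idx]
            # nearest prototype to X[i]
            j = int(np.argmin(np.linalg.norm(P - X[i], axis=1)))
            if y[proto_idx[j]] != y[i]:
                # misclassified point becomes a new prototype
                proto_idx.append(i)
                changed = True
    return X[proto_idx], y[proto_idx]

def predict(protos, labels, x):
    """Classify x by its nearest prototype (1-NN over the compressed set)."""
    return labels[int(np.argmin(np.linalg.norm(protos - x, axis=1)))]
```

On well-separated classes the loop terminates after keeping only a few prototypes per class, which is the kind of compression-versus-error trade-off the paper quantifies.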


Related research

- Universal consistency and rates of convergence of multiclass prototype algorithms in metric spaces (10/01/2020)
- Refined Error Bounds for Several Learning Algorithms (12/22/2015)
- A universally consistent learning rule with a universally monotone error (08/22/2021)
- Optimal Convergence Rates of Deep Neural Networks in a Classification Setting (07/25/2022)
- Error rate control for classification rules in multiclass mixture models (09/29/2021)
- Compression Optimality of Asymmetric Numeral Systems (09/06/2022)
- Fast classification rates without standard margin assumptions (10/28/2019)
