A universally consistent learning rule with a universally monotone error

08/22/2021
by Vladimir Pestov, et al.

We present a universally consistent learning rule whose expected error is monotone non-increasing in the sample size under every data distribution. The question of the existence of such rules was raised in 1996 by Devroye, Györfi, and Lugosi, who called them "smart" rules. Our rule is fully deterministic: a data-dependent partitioning rule constructed on an arbitrary domain (a standard Borel space) equipped with a cyclic order. The central idea is to partition, at each step, only those cyclic intervals that exhibit sufficient empirical diversity of labels, thus avoiding the region where the error function is convex.
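To make the diversity-gated splitting idea concrete, here is a minimal illustrative sketch in Python for the special case of binary labels on the circle [0, 1) with its natural cyclic order. Everything in it (the threshold `tau`, midpoint splits, majority-vote prediction, and all function names) is a hypothetical simplification for intuition only, not the paper's actual construction, which works over an arbitrary standard Borel space with a calibrated splitting criterion.

```python
import numpy as np

def diversity(labels):
    """Empirical label diversity of a sample: fraction of the minority label."""
    if len(labels) == 0:
        return 0.0
    p = np.mean(labels)
    return min(p, 1 - p)

def build_partition(points, labels, n_splits, tau=0.25):
    """Greedily refine a partition of the circle [0, 1) into half-open arcs,
    splitting an arc only when its sample shows enough label diversity."""
    intervals = [(0.0, 1.0)]  # start with the whole circle
    for _ in range(n_splits):
        refined = []
        for (a, b) in intervals:
            mask = (points >= a) & (points < b)
            if diversity(labels[mask]) >= tau:
                mid = (a + b) / 2
                refined += [(a, mid), (mid, b)]  # diverse labels: split the arc
            else:
                refined.append((a, b))  # labels nearly pure: leave the arc intact
        intervals = refined
    return intervals

def predict(x, points, labels, intervals):
    """Majority vote of the training labels inside the arc containing x."""
    for (a, b) in intervals:
        if a <= x < b:
            mask = (points >= a) & (points < b)
            if mask.any():
                return int(np.mean(labels[mask]) >= 0.5)
    return 0  # empty arc: fall back to a default label

# Toy usage: a target concept that is pure on each half of the circle.
rng = np.random.default_rng(0)
points = rng.random(200)
labels = (points < 0.5).astype(int)
intervals = build_partition(points, labels, n_splits=5)
print(predict(0.25, points, labels, intervals))  # -> 1
```

Even in this toy version the role of the gate is visible: arcs whose labels are already nearly pure are never split further, so refinement happens only where it can actually reduce the empirical error.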


