Refined Error Bounds for Several Learning Algorithms

12/22/2015
by Steve Hanneke

This article studies the achievable guarantees on the error rates of certain learning algorithms, with particular focus on refining logarithmic factors. Many of the results are based on a general technique for obtaining bounds on the error rates of sample-consistent classifiers with monotonic error regions, in the realizable case. We prove bounds of this type expressed in terms of either the VC dimension or the sample compression size. This general technique also enables us to derive several new bounds on the error rates of general sample-consistent learning algorithms, as well as refined bounds on the label complexity of the CAL active learning algorithm. Additionally, we establish a simple necessary and sufficient condition for the existence of a distribution-free bound on the error rates of all sample-consistent learning rules, converging at a rate inversely proportional to the sample size. We also study learning in the presence of classification noise, deriving a new excess error rate guarantee for general VC classes under Tsybakov's noise condition, and establishing a simple and general necessary and sufficient condition for the minimax excess risk under bounded noise to converge at a rate inversely proportional to the sample size.
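For context on the kind of guarantee being refined: the classical realizable-case result for VC classes (Vapnik and Chervonenkis; Blumer, Ehrenfeucht, Haussler, and Warmuth) states that, with probability at least 1 - \delta, every classifier \hat{h} in a class of VC dimension d that is consistent with n i.i.d. labeled examples has error rate

    \mathrm{er}(\hat{h}) \;\le\; \frac{c}{n}\left( d \log\frac{n}{d} + \log\frac{1}{\delta} \right)

for a universal constant c. Refining logarithmic factors asks when the \log(n/d) term can be removed, so that the bound converges at a rate inversely proportional to the sample size, as in the necessary and sufficient condition the abstract describes.

The abstract also mentions the CAL active learning algorithm (Cohn, Atlas, and Ladner), which queries a label only when the classifiers consistent with the labels seen so far disagree on the point. Below is a minimal conceptual sketch in Python, assuming a finite hypothesis class given as a list of callables; the names cal, hypothesis_class, unlabeled_stream, and label_oracle are illustrative placeholders rather than notation from the paper:

    def cal(hypothesis_class, unlabeled_stream, label_oracle):
        # Version space: all hypotheses consistent with the labels seen so far.
        version_space = list(hypothesis_class)
        queried = []
        for x in unlabeled_stream:
            # Query only if consistent hypotheses disagree on x; otherwise
            # the label of x is already determined and no query is needed.
            if len({h(x) for h in version_space}) > 1:
                y = label_oracle(x)
                queried.append((x, y))
                version_space = [h for h in version_space if h(x) == y]
        return version_space, queried

For threshold classifiers on [0, 1], for instance, the disagreement region shrinks rapidly and the number of queries grows only logarithmically in the stream length; the label complexity bounds refined in the paper quantify this kind of behavior in general.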


