Monotone Learning

02/10/2022
by Olivier Bousquet, et al.

The amount of training data is one of the key factors that determines the generalization capacity of learning algorithms. Intuitively, one expects the error rate to decrease as the amount of training data increases. Perhaps surprisingly, natural attempts to formalize this intuition give rise to interesting and challenging mathematical questions. For example, in their classical book on pattern recognition, Devroye, Györfi, and Lugosi (1996) ask whether there exists a monotone Bayes-consistent algorithm. This question remained open for over 25 years, until recently Pestov (2021) resolved it for binary classification, using an intricate construction of a monotone Bayes-consistent algorithm.

We derive a general result in multiclass classification, showing that every learning algorithm A can be transformed into a monotone one with similar performance. Further, the transformation is efficient and uses only black-box oracle access to A. This demonstrates that one can provably avoid non-monotonic behaviour without compromising performance, thus answering questions asked by Devroye et al. (1996), Viering, Mey, and Loog (2019), Viering and Loog (2021), and Mhammedi (2021).

Our transformation readily implies monotone learners in a variety of contexts: for example, it extends Pestov's result to classification tasks with an arbitrary number of labels. This is in contrast with Pestov's work, which is tailored to binary classification.

In addition, we provide uniform bounds on the error of the monotone algorithm. This makes our transformation applicable in distribution-free settings. For example, in PAC learning it implies that every learnable class admits a monotone PAC learner. This resolves questions by Viering, Mey, and Loog (2019); Viering and Loog (2021); and Mhammedi (2021).
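For intuition only, the sketch below (Python with scikit-learn) illustrates the general flavour of a black-box monotone wrapper: retrain the base learner A on all data seen so far, and keep the previously deployed hypothesis unless the new candidate looks at least as good on held-out data, so the estimated error never increases. This is not the construction from the paper, whose monotonicity guarantees hold for the true error and require a much more careful argument; the names MonotoneWrapper and make_learner are invented for this example.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.metrics import zero_one_loss


class MonotoneWrapper:
    """Illustrative validation-based wrapper around a black-box learner.

    NOT the transformation from Bousquet et al. (2022); a minimal sketch
    of the idea that more data should never make the deployed hypothesis
    (appear to) get worse.
    """

    def __init__(self, make_learner, val_fraction=0.2, random_state=0):
        self.make_learner = make_learner   # callable returning a fresh estimator (the black-box A)
        self.val_fraction = val_fraction
        self.random_state = random_state
        self.current = None                # currently deployed hypothesis
        self.current_val_error = np.inf    # its estimated error on held-out data

    def update(self, X, y):
        """Receive all training data seen so far and possibly update the hypothesis."""
        X_tr, X_val, y_tr, y_val = train_test_split(
            X, y, test_size=self.val_fraction, random_state=self.random_state
        )
        candidate = self.make_learner()
        candidate.fit(X_tr, y_tr)          # single black-box oracle call to A
        cand_err = zero_one_loss(y_val, candidate.predict(X_val))

        # Keep the old hypothesis unless the candidate is at least as good
        # on held-out data; this enforces monotonicity of the *estimated* error.
        if self.current is None or cand_err <= self.current_val_error:
            self.current = candidate
            self.current_val_error = cand_err
        return self.current

    def predict(self, X):
        return self.current.predict(X)


# Hypothetical usage: feed growing prefixes of a dataset (X, y).
# from sklearn.tree import DecisionTreeClassifier
# wrapper = MonotoneWrapper(lambda: DecisionTreeClassifier())
# model = wrapper.update(X[:100], y[:100])
# model = wrapper.update(X[:200], y[:200])
```

Note the gap between this sketch and the paper's guarantee: comparing noisy validation estimates does not by itself ensure that the true error is monotone, which is precisely the difficulty the paper's transformation addresses.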
