Fine-Grained Distribution-Dependent Learning Curves

08/31/2022
by Olivier Bousquet et al.

Learning curves plot the expected error of a learning algorithm as a function of the number of labeled samples it receives from a target distribution. They are widely used as a measure of an algorithm's performance, but classic PAC learning theory cannot explain their behavior. As observed by Antos and Lugosi (1996, 1998), the classic 'No Free Lunch' lower bounds only trace the upper envelope above all learning curves of specific target distributions. For a concept class with VC dimension d, the classic bound decays like d/n, yet it is possible that the learning curve for every specific distribution decays exponentially. In this case, for each n there exists a different 'hard' distribution requiring d/n samples. Antos and Lugosi asked which concept classes admit a 'strong minimax lower bound': a lower bound of d'/n that holds for a fixed distribution for infinitely many n. We solve this problem in a principled manner by introducing a combinatorial dimension, called VCL, that characterizes the best d' for which d'/n is a strong minimax lower bound. Our characterization strengthens the lower bounds of Bousquet, Hanneke, Moran, van Handel, and Yehudayoff (2021), and it refines their theory of learning curves by showing that, for classes with finite VCL, the learning rate can be decomposed into a linear component that depends only on the hypothesis class and an exponential component that depends also on the target distribution. As a corollary, we recover the lower bound of Antos and Lugosi (1996, 1998) for half-spaces in ℝ^d. Finally, to provide another viewpoint on our work and how it compares to traditional PAC learning bounds, we also present an alternative formulation of our results in a language that is closer to the PAC setting.
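
To make the envelope phenomenon concrete, here is a minimal numerical sketch (an illustration under assumptions of our own, not a construction from the paper): a two-function class H = {h0, h1} whose members disagree on a single point x*, learned by an ERM that breaks ties toward h1. Under any fixed realizable distribution placing mass p on x*, the expected error p(1-p)^n decays exponentially in n, yet maximizing over p at each n traces a Θ(1/n) 'No Free Lunch' envelope, with a different hard distribution for each n.

```python
import numpy as np

# Illustrative class H = {h0, h1} where h0 and h1 disagree on a single
# point x*, with an ERM learner that breaks ties toward h1. Under a
# realizable target distribution putting mass p on x*, the learner errs
# (with loss p) exactly when x* is absent from the n samples, so
#   E[error](n, p) = p * (1 - p)**n.
def expected_error(n, p):
    return p * (1.0 - p) ** n

ns = np.arange(1, 201)

# Every fixed distribution yields an exponentially decaying learning curve:
for p in (0.05, 0.1, 0.2):
    halving = np.log(2) / -np.log(1 - p)
    print(f"p={p}: error decays like (1-p)^n, halving every ~{halving:.1f} samples")

# ...yet the minimax envelope over distributions decays only linearly:
# sup_p p*(1-p)^n is attained at p = 1/(n+1) and equals
# (1/(n+1)) * (n/(n+1))^n ~ 1/(e*n), a Theta(1/n) lower-bound curve
# traced by a *different* hard distribution for each n.
envelope = expected_error(ns, 1.0 / (ns + 1))
print("n * envelope:", (ns * envelope)[[0, 9, 99, 199]], "-> 1/e ~", 1 / np.e)
```

In this toy setting the abstract's decomposition is visible directly: the exponential component (1-p)^n depends on the target distribution through p, while the Θ(1/n) envelope depends only on the class.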
