Minimizers of the Empirical Risk and Risk Monotonicity

07/11/2019
by Marco Loog et al.

Plotting a learner's average performance against the number of training samples results in a learning curve. Studying such curves on one or more data sets is a way to gain a better understanding of the generalization properties of the learner. The behavior of learning curves is, however, not well understood, and they can display behavior that most researchers would find quite unexpected. Our work introduces the formal notion of risk monotonicity, which requires that the risk, in expectation over the training samples, does not deteriorate as the training set grows. We then present the surprising result that various standard learners, specifically those that minimize the empirical risk, can act nonmonotonically irrespective of the training sample size. We provide a theoretical underpinning for specific instantiations from classification, regression, and density estimation. Altogether, the proposed monotonicity notion opens up a whole new direction of research.
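The nonmonotonicity phenomenon can be illustrated with a small Monte Carlo sketch. The example below is not one of the paper's constructions: the distribution (standard Gaussian inputs, unit-variance noise), the dimension d=3, and the choice of minimal-norm least-squares as the empirical risk minimizer are all illustrative assumptions. It estimates the expected squared risk of the learner at a few training set sizes; the risk is typically far worse at n = d than at n = 1, so more data does not always help.

```python
import random

def solve(A, b):
    """Solve the square linear system A x = b by Gauss-Jordan
    elimination with partial pivoting (pure-Python helper)."""
    m = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(m):
        piv = max(range(col, m), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(m):
            if r != col and M[col][col] != 0.0:
                f = M[r][col] / M[col][col]
                for c in range(col, m + 1):
                    M[r][c] -= f * M[col][c]
    return [M[i][m] / M[i][i] for i in range(m)]

def min_norm_ols(X, y):
    """Minimal-norm empirical risk minimizer for squared loss:
    w = X^T (X X^T)^{-1} y when n < d, else the normal-equations
    solution (X^T X) w = X^T y."""
    n, d = len(X), len(X[0])
    if n < d:
        G = [[sum(X[i][k] * X[j][k] for k in range(d)) for j in range(n)]
             for i in range(n)]
        a = solve(G, y)
        return [sum(a[i] * X[i][k] for i in range(n)) for k in range(d)]
    A = [[sum(X[i][j] * X[i][k] for i in range(n)) for k in range(d)]
         for j in range(d)]
    b = [sum(X[i][j] * y[i] for i in range(n)) for j in range(d)]
    return solve(A, b)

def expected_risk(n, d=3, trials=2000, sigma=1.0, seed=0):
    """Monte Carlo estimate of the expected squared risk of the
    learner trained on n samples; for x ~ N(0, I) the risk equals
    ||w_hat - w_true||^2 + sigma^2 in closed form."""
    rng = random.Random(seed)
    w_true = [1.0] * d
    total = 0.0
    for _ in range(trials):
        X = [[rng.gauss(0, 1) for _ in range(d)] for _ in range(n)]
        y = [sum(X[i][k] * w_true[k] for k in range(d)) + rng.gauss(0, sigma)
             for i in range(n)]
        w = min_norm_ols(X, y)
        total += sum((w[k] - w_true[k]) ** 2 for k in range(d)) + sigma ** 2
    return total / trials

# Expected risk at three sample sizes: the curve rises sharply at n = d.
risks = {n: expected_risk(n) for n in (1, 3, 20)}
for n in (1, 3, 20):
    print(f"n = {n:2d}: estimated expected risk = {risks[n]:.2f}")
```

With these assumptions the estimated risk at n = 3 (where the learner just barely interpolates the data) exceeds the risk at n = 1, so the learning curve is nonmonotone in expectation, in the sense formalized above.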

