On the Minimal Error of Empirical Risk Minimization

02/24/2021
by Gil Kur, et al.

We study the minimal error of the Empirical Risk Minimization (ERM) procedure in the task of regression, in both the random and the fixed design settings. Our sharp lower bounds shed light on the possibility (or impossibility) of adapting to the simplicity of the model generating the data. In the fixed design setting, we show that the error is governed by the global complexity of the entire class. In contrast, in the random design setting, ERM may only adapt to simpler models if the local neighborhoods around the regression function are nearly as complex as the class itself, a somewhat counter-intuitive conclusion. We provide sharp lower bounds on the performance of ERM for both Donsker and non-Donsker classes. We also discuss our results through the lens of recent studies on interpolation in overparameterized models.
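For context, here is a minimal sketch of the standard least-squares ERM setup the abstract refers to; the notation below is a common formalization and is our assumption, not taken from the paper itself. Given samples (X_1, Y_1), …, (X_n, Y_n) with Y_i = f_0(X_i) + \xi_i for an unknown regression function f_0 in a class \mathcal{F}, the ERM estimator is

    \hat{f} \in \operatorname*{arg\,min}_{f \in \mathcal{F}} \frac{1}{n} \sum_{i=1}^{n} \bigl( Y_i - f(X_i) \bigr)^2 .

In the random design setting, the error is measured out of sample, \mathbb{E}\bigl[ (\hat{f}(X) - f_0(X))^2 \bigr] with X drawn afresh from the covariate distribution; in the fixed design setting, it is measured in sample, \frac{1}{n} \sum_{i=1}^{n} \bigl( \hat{f}(x_i) - f_0(x_i) \bigr)^2 at the given design points x_1, …, x_n. The abstract's contrast between the two settings is about lower bounds on this error.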
