Benign, Tempered, or Catastrophic: A Taxonomy of Overfitting

07/14/2022
by Neil Mallinar, et al.

The practical success of overparameterized neural networks has motivated the recent scientific study of interpolating methods, which perfectly fit their training data. Certain interpolating methods, including neural networks, can fit noisy training data without catastrophically bad test performance, in defiance of standard intuitions from statistical learning theory. Aiming to explain this, a body of recent work has studied benign overfitting, a phenomenon where some interpolating methods approach Bayes optimality, even in the presence of noise. In this work we argue that while benign overfitting has been instructive and fruitful to study, many real interpolating methods like neural networks do not fit benignly: modest noise in the training set causes nonzero (but non-infinite) excess risk at test time, implying these models are neither benign nor catastrophic but rather fall in an intermediate regime. We call this intermediate regime tempered overfitting, and we initiate its systematic study. We first explore this phenomenon in the context of kernel (ridge) regression (KR) by obtaining conditions on the ridge parameter and kernel eigenspectrum under which KR exhibits each of the three behaviors. We find that kernels with power-law spectra, including Laplace kernels and ReLU neural tangent kernels, exhibit tempered overfitting. We then empirically study deep neural networks through the lens of our taxonomy, and find that those trained to interpolation are tempered, while those stopped early are benign. We hope our work leads to a more refined understanding of overfitting in modern learning.
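As a rough illustration of the taxonomy (not code from the paper), the sketch below fits kernel ridge regression with a Laplace kernel, whose power-law spectrum the abstract identifies as tempered, to noisy synthetic data and measures excess test risk relative to the noiseless target. The data-generating function, noise level, gamma value, and sample sizes are illustrative assumptions; the point is only that a near-interpolating fit (ridge parameter close to zero) typically settles at a nonzero excess risk, while a well-regularized fit drives it down, matching the tempered-versus-benign distinction.

```python
# Minimal sketch (illustrative assumptions, not the paper's experiments):
# kernel ridge regression with a Laplace kernel on noisy synthetic data.
# Under the paper's taxonomy, excess test risk that settles at a nonzero
# constant as ridge -> 0 is "tempered", excess risk tending to 0 is
# "benign", and diverging risk is "catastrophic".
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(0)

def target(x):
    # Hypothetical noiseless regression target used only for this sketch.
    return np.sin(2 * np.pi * x)

n_train, n_test, noise_std = 400, 2000, 0.5
X_train = rng.uniform(0, 1, size=(n_train, 1))
y_train = target(X_train).ravel() + noise_std * rng.normal(size=n_train)
X_test = rng.uniform(0, 1, size=(n_test, 1))
y_test_clean = target(X_test).ravel()

# Laplace kernel k(x, x') = exp(-gamma * ||x - x'||_1); its power-law
# eigenspectrum is the regime the abstract describes as tempered.
for alpha in [1e-9, 1e-3, 1e-1]:  # ridge parameter; 1e-9 is near-interpolation
    model = KernelRidge(alpha=alpha, kernel="laplacian", gamma=10.0)
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    # MSE against the clean targets estimates the excess risk over Bayes optimal.
    excess_risk = np.mean((pred - y_test_clean) ** 2)
    print(f"ridge alpha={alpha:g}: excess test risk ~ {excess_risk:.3f}")
```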

Related research

03/07/2023  Benign Overfitting for Two-layer ReLU Networks
Modern deep learning models with great expressive power can be trained t...

02/05/2018  To understand deep learning we need to understand kernel learning
Generalization performance of classifiers in deep learning has recently ...

06/22/2023  An Agnostic View on the Cost of Overfitting in (Kernel) Ridge Regression
We study the cost of overfitting in noisy kernel ridge regression (KRR),...

06/11/2020  A new measure for overfitting and its implications for backdooring of deep learning
Overfitting describes the phenomenon that a machine learning model fits ...

05/24/2023  From Tempered to Benign Overfitting in ReLU Neural Networks
Overparameterized neural networks (NNs) are observed to generalize well ...

05/23/2023  Mind the spikes: Benign overfitting of kernels and neural networks in fixed dimension
The success of over-parameterized neural networks trained to near-zero t...

09/04/2022  Towards Understanding the Overfitting Phenomenon of Deep Click-Through Rate Prediction Models
Deep learning techniques have been applied widely in industrial recommen...