Foolish Crowds Support Benign Overfitting

10/06/2021
by   Niladri S. Chatterji, et al.

We prove a lower bound on the excess risk of sparse interpolating procedures for linear regression with Gaussian data in the overparameterized regime. We apply this result to obtain a lower bound for basis pursuit (the minimum ℓ_1-norm interpolant) that implies that its excess risk can converge at an exponentially slower rate than OLS (the minimum ℓ_2-norm interpolant), even when the ground truth is sparse. Our analysis exposes the benefit of an effect analogous to the "wisdom of the crowd", except here the harm arising from fitting the noise is ameliorated by spreading it among many directions – the variance reduction arises from a foolish crowd.
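The contrast between the two interpolants can be seen numerically. The sketch below (an illustration, not the paper's construction) fits noisy overparameterized Gaussian data with both the minimum ℓ_2-norm interpolant (via the pseudoinverse) and the minimum ℓ_1-norm interpolant (basis pursuit, cast as a linear program); the dimensions, noise level, and sparse ground truth are arbitrary choices for demonstration.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, d = 20, 100  # overparameterized regime: d >> n (illustrative sizes)
w_star = np.zeros(d)
w_star[0] = 1.0  # sparse ground truth
X = rng.standard_normal((n, d))
y = X @ w_star + 0.1 * rng.standard_normal(n)

# Minimum l2-norm interpolant ("OLS" in the overparameterized regime):
# w = X^+ y, the least-norm solution of X w = y.
w_l2 = np.linalg.pinv(X) @ y

# Minimum l1-norm interpolant (basis pursuit) as a linear program:
# write w = u - v with u, v >= 0, minimize sum(u) + sum(v) s.t. X(u - v) = y.
c = np.ones(2 * d)
A_eq = np.hstack([X, -X])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
w_l1 = res.x[:d] - res.x[d:]

# Both procedures interpolate the training data exactly (up to solver tolerance).
assert np.allclose(X @ w_l2, y, atol=1e-6)
assert np.allclose(X @ w_l1, y, atol=1e-5)

# For isotropic Gaussian features the excess risk is ||w - w_star||^2.
print("l2 excess risk:", np.sum((w_l2 - w_star) ** 2))
print("l1 excess risk:", np.sum((w_l1 - w_star) ** 2))
```

Note how the ℓ_2 solution spreads the fitted noise across all d directions (the "foolish crowd"), while basis pursuit concentrates it on few coordinates; the paper's lower bound shows this concentration can make the ℓ_1 excess risk converge exponentially more slowly.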


Related research

06/17/2021 · Uniform Convergence of Interpolators: Gaussian Width, Norm Bounds, and Benign Overfitting
We consider interpolation learning in high-dimensional linear regression...

09/19/2022 · Deep Linear Networks can Benignly Overfit when Shallow Ones Do
We bound the excess risk of interpolating deep linear networks trained u...

11/10/2021 · Tight bounds for minimum l1-norm interpolation of noisy data
We provide matching upper and lower bounds of order σ^2/log(d/n) for the...

02/02/2020 · Overfitting Can Be Harmless for Basis Pursuit: Only to a Degree
Recently, there have been significant interests in studying the generali...

07/24/2023 · Label Noise: Correcting a Correction
Training neural network classifiers on datasets with label noise poses a...

03/11/2022 · A geometrical viewpoint on the benign overfitting property of the minimum l_2-norm interpolant estimator
Practitioners have observed that some deep learning models generalize we...

06/14/2023 · Batches Stabilize the Minimum Norm Risk in High Dimensional Overparameterized Linear Regression
Learning algorithms that divide the data into batches are prevalent in m...
