Tight bounds for minimum ℓ_1-norm interpolation of noisy data

11/10/2021
by Guillaume Wang, et al.

We provide matching upper and lower bounds of order σ^2/log(d/n) for the prediction error of the minimum ℓ_1-norm interpolator, a.k.a. basis pursuit. Our result is tight up to negligible terms when d ≫ n, and is the first to imply asymptotic consistency of noisy minimum-norm interpolation for isotropic features and sparse ground truths. Our work complements the literature on "benign overfitting" for minimum ℓ_2-norm interpolation, where asymptotic consistency can be achieved only when the features are effectively low-dimensional.
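As a concrete illustration (not taken from the paper), the minimum ℓ_1-norm interpolator can be computed as a linear program: minimize ||β||_1 subject to Xβ = y, using the standard split β = u − v with u, v ≥ 0. The sketch below, assuming NumPy and SciPy's linprog, runs it on isotropic Gaussian features with a sparse ground truth; the function name min_l1_interpolator and all problem sizes are illustrative choices, not from the paper.

```python
# Minimal sketch of the minimum l1-norm interpolator (basis pursuit):
#     minimize ||b||_1  subject to  X b = y,
# written as a linear program via the split b = u - v with u, v >= 0.
import numpy as np
from scipy.optimize import linprog

def min_l1_interpolator(X, y):
    """argmin ||b||_1 s.t. X b = y (assumes a feasible system, d >= n)."""
    n, d = X.shape
    c = np.ones(2 * d)                      # sum(u) + sum(v) = ||b||_1
    A_eq = np.hstack([X, -X])               # X u - X v = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * d))
    u, v = res.x[:d], res.x[d:]
    return u - v

# Illustration: isotropic Gaussian features, sparse ground truth, noise
# level sigma (all values arbitrary, chosen so that d >> n as in the paper).
rng = np.random.default_rng(0)
n, d, sigma = 50, 2000, 0.5
X = rng.standard_normal((n, d))
b_star = np.zeros(d)
b_star[:3] = 1.0                            # sparse ground truth
y = X @ b_star + sigma * rng.standard_normal(n)
b_hat = min_l1_interpolator(X, y)
print("interpolation residual:", np.linalg.norm(X @ b_hat - y))
# For isotropic features, the prediction error equals ||b_hat - b*||^2.
print("prediction error:", np.linalg.norm(b_hat - b_star) ** 2)
```

The LP split doubles the variable count, which is fine at this scale; for larger problems one would typically use a dedicated basis-pursuit solver instead.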

Related research

06/17/2021 · Uniform Convergence of Interpolators: Gaussian Width, Norm Bounds, and Benign Overfitting
We consider interpolation learning in high-dimensional linear regression...

07/28/2023 · Noisy Interpolation Learning with Shallow Univariate ReLU Networks
We study the asymptotic overfitting behavior of interpolation with minim...

12/28/2018 · Consistency of Interpolation with Laplace Kernels is a High-Dimensional Phenomenon
We show that minimum-norm interpolation in the Reproducing Kernel Hilber...

12/01/2020 · On the robustness of minimum-norm interpolators
This article develops a general theory for minimum-norm interpolated est...

12/07/2022 · Tight bounds for maximum ℓ_1-margin classifiers
Popular iterative algorithms such as boosting methods and coordinate des...

10/06/2021 · Foolish Crowds Support Benign Overfitting
We prove a lower bound on the excess risk of sparse interpolating proced...

11/28/2018 · Basis Pursuit Denoise with Nonsmooth Constraints
Level-set optimization formulations with data-driven constraints minimiz...
