Regress Consistently when Oblivious Outliers Overwhelm

09/30/2020
by Tommaso d'Orsi, et al.

We give a novel analysis of the Huber loss estimator for consistent robust linear regression, proving that it simultaneously achieves an optimal dependency on the fraction of outliers and on the dimension. We consider a linear regression model with an oblivious adversary, who may corrupt the observations in an arbitrary way but without knowing the data. (This adversary model also captures heavy-tailed noise distributions.) Given observations y_1,…,y_n of which an α fraction is uncorrupted, we obtain error guarantees Õ(√(d/(α^2·n))), optimal up to logarithmic terms. Our algorithm works with a nearly optimal fraction of inliers α ≥ Õ(√(d/n)) and under a mild restricted isometry property (RIP) assumption on the (transposed) design matrix. Prior to this work, even in the simple case of spherical Gaussian design, no estimator was known to achieve vanishing error guarantees in the high-dimensional setting d ≳ √n whenever the fraction of uncorrupted observations is smaller than 1/log n. Our analysis of the Huber loss estimator exploits only first-order optimality conditions. Furthermore, in the special case of Gaussian design X ∼ N(0,1)^{n×d}, we show that a strikingly simple algorithm based on computing coordinate-wise medians achieves similar guarantees in linear time. This algorithm also extends to the setting where the parameter vector β^* is sparse.
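To make the setting concrete, the Python sketch below simulates a linear model y = Xβ^* + η in which an oblivious adversary corrupts most responses, and then fits β^* by minimizing the Huber loss. This is only an illustrative toy experiment, not the authors' implementation: the sample size, dimension, inlier fraction, Huber threshold, and corruption magnitudes are all assumptions chosen for demonstration.

```python
# Minimal sketch of robust linear regression via Huber loss minimization
# under an oblivious corruption model. All constants are illustrative.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, d, alpha = 2000, 20, 0.3              # samples, dimension, fraction of uncorrupted rows

beta_star = rng.normal(size=d)           # unknown parameter vector
X = rng.normal(size=(n, d))              # spherical Gaussian design
y = X @ beta_star + rng.normal(scale=0.5, size=n)

# Oblivious adversary: corruptions are drawn without looking at X or beta_star.
corrupted = rng.random(n) > alpha
y[corrupted] += rng.normal(scale=100.0, size=corrupted.sum())

def huber_loss(beta, X, y, delta=1.0):
    """Sum of Huber penalties: quadratic for |residual| <= delta, linear beyond."""
    r = y - X @ beta
    quad = np.abs(r) <= delta
    return np.sum(np.where(quad, 0.5 * r**2, delta * (np.abs(r) - 0.5 * delta)))

beta_huber = minimize(huber_loss, np.zeros(d), args=(X, y)).x
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]

print("Huber error:", np.linalg.norm(beta_huber - beta_star))
print("OLS error:  ", np.linalg.norm(beta_ols - beta_star))
```

On a synthetic run like this, the Huber fit typically recovers β^* far more accurately than ordinary least squares, which is the consistency phenomenon the abstract quantifies; the paper's analysis, however, relies only on first-order optimality conditions of the Huber objective, not on any particular solver.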


Related research

03/19/2019  Adaptive Hard Thresholding for Near-optimal Consistent Robust Regression
We study the problem of robust linear regression with response variable ...

11/04/2021  Consistent Estimation for PCA and Sparse Regression with Oblivious Outliers
We develop machinery to design efficiently computable and consistent est...

06/16/2022  On the well-spread property and its relation to linear regression
We consider the robust linear regression model y = Xβ^* + η, where an ad...

10/30/2022  Robust and Tuning-Free Sparse Linear Regression via Square-Root Slope
We consider the high-dimensional linear regression model and assume that...

04/12/2019  Outlier-robust estimation of a sparse linear model using ℓ_1-penalized Huber's M-estimator
We study the problem of estimating a p-dimensional s-sparse vector in a ...

03/14/2022  The TAP free energy for high-dimensional linear regression
We derive a variational representation for the log-normalizing constant ...

09/21/2018  Compressed Sensing with Adversarial Sparse Noise via L1 Regression
We present a simple and effective algorithm for the problem of sparse ro...
