Optimal Robust Linear Regression in Nearly Linear Time
We study the problem of high-dimensional robust linear regression, where a learner is given access to n samples from the generative model Y = ⟨X, w^*⟩ + ϵ (with X ∈ ℝ^d and ϵ independent of X), in which an η fraction of the samples have been adversarially corrupted. We propose estimators for this problem under two settings: (i) X is L_4-L_2 hypercontractive, 𝔼[XX^⊤] has bounded condition number, and ϵ has bounded variance; and (ii) X is sub-Gaussian with identity second moment and ϵ is sub-Gaussian. In both settings, our estimators (a) achieve optimal sample complexity and recovery guarantees up to log factors and (b) run in near-linear time (Õ(nd / η^6)). Prior to our work, polynomial-time algorithms achieving near-optimal sample complexity were known only in the setting where X is Gaussian with identity covariance and ϵ is Gaussian, and no linear-time estimators were known for robust linear regression in any setting. Our estimators and their analysis leverage recent developments in the construction of faster algorithms for robust mean estimation to improve runtimes, and refined concentration-of-measure arguments alongside Gaussian rounding techniques to improve statistical sample complexity.
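To make the corruption model concrete, here is a minimal sketch of setting (ii): Gaussian covariates with identity covariance, sub-Gaussian noise, and an adversary replacing an η fraction of responses with gross outliers. The "robust" fit shown is a simple iterative-trimming baseline for illustration only, not the paper's estimator; all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, eta = 2000, 10, 0.1           # samples, dimension, corruption fraction (assumed values)

w_star = rng.normal(size=d)         # unknown regressor w^*
X = rng.normal(size=(n, d))         # covariates with identity second moment
eps = 0.1 * rng.normal(size=n)      # independent sub-Gaussian noise
y = X @ w_star + eps                # Y = <X, w^*> + eps

# Adversary corrupts an eta fraction of the responses arbitrarily.
m = int(eta * n)
bad = rng.choice(n, size=m, replace=False)
y_corr = y.copy()
y_corr[bad] = 100.0                 # gross outliers

# Ordinary least squares on the corrupted data is badly biased.
w_ols = np.linalg.lstsq(X, y_corr, rcond=None)[0]

# Crude robust baseline: repeatedly refit after discarding the
# eta*n samples with the largest absolute residuals.
w_hat = w_ols.copy()
for _ in range(20):
    resid = np.abs(y_corr - X @ w_hat)
    thresh = np.partition(resid, n - m - 1)[n - m - 1]
    keep = resid <= thresh
    w_hat = np.linalg.lstsq(X[keep], y_corr[keep], rcond=None)[0]

print("OLS error:    ", np.linalg.norm(w_ols - w_star))
print("trimmed error:", np.linalg.norm(w_hat - w_star))
```

Even this naive trimming heuristic recovers w^* far more accurately than OLS here; the paper's contribution is an estimator with provable near-optimal error that runs in Õ(nd/η^6) time.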