List-Decodable Subspace Recovery via Sum-of-Squares

02/12/2020
by Ainesh Bakshi, et al.

We give the first efficient algorithm for the problem of list-decodable subspace recovery. Our algorithm takes as input n samples, αn of which (α ≪ 1/2) are generated i.i.d. from a Gaussian distribution N(0, Σ_*) on R^d with covariance Σ_* of rank r; the rest are arbitrary, potentially adversarial outliers. It outputs a list of O(1/α) projection matrices guaranteed to contain a projection matrix Π such that ‖Π − Π_*‖_F^2 = κ^4 log(r) · Õ(1/α^2), where Õ hides polylogarithmic factors in 1/α. Here, Π_* is the projection matrix onto the range space of Σ_*, and κ is the ratio of the largest to the smallest non-zero eigenvalue of Σ_*. The algorithm needs n = d^(log(rκ) · Õ(1/α^2)) samples and runs in time n^(log(rκ) · Õ(1/α^4)). Our algorithm builds on the recently developed framework for list-decodable learning via the sum-of-squares (SoS) method [KKK'19, RY'20], with some key technical and conceptual advancements. Our key conceptual contribution is an SoS-"certified" lower bound on the eigenvalues of covariances of arbitrary small subsamples of an i.i.d. sample from a certifiably anti-concentrated distribution. One of our key technical contributions is a new method that allows error reduction "within SoS" at only a logarithmic cost in the exponent of the running time (in contrast to the polynomial cost in [KKK'19, RY'20]). In concurrent and independent work, Raghavendra and Yau proved related results for list-decodable subspace recovery [RY'20].
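To make the problem setup concrete, here is a minimal numpy sketch of the data model and the error metric from the abstract: a rank-r covariance Σ_* with condition number κ, a sample in which only an α-fraction are inliers, and the squared Frobenius distance ‖Π − Π_*‖_F^2 used in the recovery guarantee. This illustrates the input/output contract only, not the SoS algorithm itself; all dimensions, sample sizes, and the particular choice of outliers are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d, r, alpha, n = 20, 3, 0.3, 1000   # illustrative parameters, not from the paper

# Rank-r covariance Sigma_* = U diag(eigs) U^T; kappa = max(eigs) / min(eigs)
U = np.linalg.qr(rng.standard_normal((d, r)))[0]   # orthonormal basis of the range space
eigs = np.linspace(1.0, 4.0, r)                    # non-zero eigenvalues; kappa = 4
Sigma_star = U @ np.diag(eigs) @ U.T

# Pi_* is the projection matrix onto the range space of Sigma_*
Pi_star = U @ U.T

# alpha*n inliers ~ N(0, Sigma_*); the remaining (1-alpha)*n points are
# arbitrary -- here drawn uniformly, but the model allows any adversarial choice.
n_in = int(alpha * n)
inliers = rng.multivariate_normal(np.zeros(d), Sigma_star, size=n_in)
outliers = rng.uniform(-10.0, 10.0, size=(n - n_in, d))
X = np.vstack([inliers, outliers])

def frobenius_error(Pi):
    """Squared Frobenius distance ||Pi - Pi_*||_F^2 from the recovery guarantee."""
    return np.linalg.norm(Pi - Pi_star, "fro") ** 2

# Sanity check: the true projector achieves zero error, while naive rank-r PCA
# on the contaminated sample is pulled away from the inlier subspace.
_, _, Vt = np.linalg.svd(X, full_matrices=False)
Pi_pca = Vt[:r].T @ Vt[:r]
print(frobenius_error(Pi_star))   # 0.0
print(frobenius_error(Pi_pca))
```

A list-decodable algorithm would instead return O(1/α) candidate projections, at least one of which is guaranteed to be close to Π_* in this metric.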


