Higher degree sum-of-squares relaxations robust against oblivious outliers

11/14/2022
by   Tommaso d'Orsi, et al.

We consider estimation models of the form Y = X^* + N, where X^* is an m-dimensional signal we wish to recover and N is symmetrically distributed noise that may be unbounded in all but a small α fraction of the entries. We introduce a family of algorithms that, under mild assumptions, recover the signal X^* in every estimation problem for which there exists a sum-of-squares algorithm that recovers the signal when the noise N is Gaussian. This essentially shows that designing a sum-of-squares algorithm for an estimation problem with Gaussian noise is enough to obtain an algorithm that works under the symmetric noise model. Our framework extends far beyond previous results on symmetric noise models and is robust even to adversarial perturbations. As concrete examples, we investigate two problems for which no efficient algorithms were known to work for heavy-tailed noise: tensor PCA and sparse PCA. For the former, our algorithm recovers the principal component in polynomial time when the signal-to-noise ratio is at least Õ(n^{p/4}/α), which matches (up to logarithmic factors) the best known algorithmic guarantees for Gaussian noise. For the latter, our algorithm runs in quasipolynomial time and matches the state-of-the-art guarantees for quasipolynomial-time algorithms in the case of Gaussian noise. Using a reduction from the planted clique problem, we provide evidence that quasipolynomial time is likely necessary for sparse PCA with symmetric noise. In our proofs we use bounds on the covering numbers of sets of pseudo-expectations, which we obtain by certifying, in sum-of-squares, upper bounds on the Gaussian complexities of sets of solutions. This approach to bounding the covering numbers of sets of pseudo-expectations may be of independent interest and may find other applications in future work.
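To make the noise model concrete, here is a minimal illustrative sketch (not the paper's algorithm) of an observation Y = X^* + N in the rank-one spiked-matrix toy case: every noise entry is symmetric about zero, only an α fraction is well-behaved, and the rest are heavy-tailed and effectively unbounded. All parameter values and the Cauchy choice for the heavy-tailed entries are assumptions for illustration; the naive spectral estimate at the end is exactly the kind of non-robust baseline such noise can break.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 200       # ambient dimension (illustrative choice)
alpha = 0.1   # fraction of entries with bounded noise, as in the model

# Rank-one signal X* = lam * u u^T for a unit vector u (toy spiked matrix).
lam = 30.0
u = rng.normal(size=n)
u /= np.linalg.norm(u)
X_star = lam * np.outer(u, u)

# Oblivious symmetric noise: every entry is symmetric about zero, but only
# an alpha fraction is bounded (here: standard Gaussian); the remaining
# entries are heavy-tailed (here: standard Cauchy, so moments don't exist).
bounded = rng.random((n, n)) < alpha
N = rng.standard_cauchy(size=(n, n))         # symmetric, unbounded tails
N[bounded] = rng.normal(size=bounded.sum())  # the "inlier" alpha fraction

Y = X_star + N

# A naive spectral estimate: top eigenvector of the symmetrized observation.
# With heavy-tailed noise this baseline can correlate poorly with u, which is
# the failure mode that robust (e.g. sum-of-squares) algorithms address.
sym = (Y + Y.T) / 2
eigvals, eigvecs = np.linalg.eigh(sym)
u_hat = eigvecs[:, -1]
correlation = abs(u_hat @ u)  # in [0, 1]; 1 means perfect recovery
```

Note that the tensor PCA setting in the abstract replaces the rank-one matrix with a rank-one order-p tensor; the matrix case is shown only because the noise structure is identical and easier to display.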


Related research

The Complexity of Sparse Tensor PCA (06/11/2021)
We study the problem of sparse tensor principal component analysis: give…

Sparse PCA: Algorithms, Adversarial Perturbations and Certificates (11/12/2020)
We study efficient algorithms for Sparse PCA in standard statistical mod…

Sparse PCA Beyond Covariance Thresholding (02/20/2023)
In the Wishart model for sparse PCA we are given n samples Y_1, …, Y_n dr…

Subexponential-Time Algorithms for Sparse PCA (07/26/2019)
We study the computational cost of recovering a unit-norm sparse princip…

Statistical limits of spiked tensor models (12/22/2016)
We study the statistical limits of both detecting and estimating a rank-…

Homotopy Analysis for Tensor PCA (10/28/2016)
Developing efficient and guaranteed nonconvex algorithms has been an imp…

SoS Degree Reduction with Applications to Clustering and Robust Moment Estimation (01/05/2021)
We develop a general framework to significantly reduce the degree of sum…
