Outlier-Robust Clustering of Non-Spherical Mixtures

05/06/2020
by Ainesh Bakshi, et al.

We give the first outlier-robust efficient algorithm for clustering a mixture of k statistically separated d-dimensional Gaussians (k-GMMs). Concretely, our algorithm takes as input an ϵ-corrupted sample from a k-GMM and outputs an approximate clustering that misclassifies at most an O(kϵ)+η fraction of the points whenever every pair of components is separated by 1 - exp(-poly(k/η)) in total variation distance. This is the statistically weakest possible notion of separation and allows, for example, clustering of mixtures whose components share a mean but have covariances differing in a single unknown direction, or are separated in Frobenius distance. The running time of our algorithm is d^O(log(κ)) poly(k/η), where κ is a measure of the spread of the mixture in any direction. For k=2, our algorithm runs in time poly(d) and uses poly(d) samples, with no dependence on the spread κ. Such results were not known prior to our work, even for k=2. More generally, our algorithm succeeds for mixtures of any distribution that satisfies two well-studied analytic assumptions: certifiable hypercontractivity and certifiable anti-concentration. Thus, it extends to clustering mixtures of arbitrary affine transforms of the uniform distribution on the d-dimensional unit sphere. Even the information-theoretic clusterability of distributions satisfying our analytic assumptions was not previously known and is likely to be of independent interest. Our algorithms build on the recent flurry of work relying on certifiable anti-concentration, first introduced in [KKK'19, RY'20]. Our techniques expand the sum-of-squares toolkit to show robust certifiability of TV-separated Gaussian clusters in data. This involves a low-degree sum-of-squares proof of statements that relate parameter distance to total variation distance, relying only on hypercontractivity and anti-concentration.
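To make the separation notion concrete, here is a toy numerical sketch (not the paper's sum-of-squares algorithm) of the example above: two mean-zero Gaussians whose covariances differ only along a single hidden direction are nearly separated in total variation, and projecting onto an estimate of that direction already suffices to cluster the uncorrupted samples. All parameters below (d=5, variances 1 vs. 100, the threshold) are illustrative choices, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 5, 2000

# Component A: N(0, I).  Component B: N(0, I + 99 * v v^T) for a hidden
# unit direction v (here e1); the components share the mean 0.
A = rng.standard_normal((n, d))
B = rng.standard_normal((n, d))
B[:, 0] *= 10.0                      # variance 100 along the hidden direction

X = np.vstack([A, B])
labels = np.array([0] * n + [1] * n)

# Estimate the hidden direction as the top eigenvector of the empirical
# covariance: the mixture's variance is largest along v.
cov = X.T @ X / len(X)
_, eigvecs = np.linalg.eigh(cov)
v_hat = eigvecs[:, -1]               # eigenvectors sorted by ascending eigenvalue

# Cluster by thresholding the squared projection; points from the
# high-variance component tend to project farther out.
proj2 = (X @ v_hat) ** 2
threshold = 4.65                     # ~ likelihood-ratio crossover for var 1 vs 100
pred = (proj2 > threshold).astype(int)

agree = (pred == labels).mean()
accuracy = max(agree, 1 - agree)     # cluster ids are only defined up to relabeling
print(f"clustering accuracy: {accuracy:.3f}")
```

In this easy, uncorrupted setting a spectral projection works; the point of the paper is that clustering remains possible efficiently under ϵ-corruption and for k components with only TV-separation, where such simple heuristics break down.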


