Robustly Learning Mixtures of k Arbitrary Gaussians

By Ainesh Bakshi et al.

We give a polynomial-time algorithm for the problem of robustly estimating a mixture of k arbitrary Gaussians in ℝ^d, for any fixed k, in the presence of a constant fraction of arbitrary corruptions. This resolves the main open problem in several previous works on algorithmic robust statistics, which addressed the special cases of robustly estimating (a) a single Gaussian, (b) a mixture of TV-distance separated Gaussians, and (c) a uniform mixture of two Gaussians. Our main tools are an efficient partial clustering algorithm that relies on the sum-of-squares method, and a novel tensor decomposition algorithm that allows errors in both Frobenius norm and low-rank terms.
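To make the corruption model concrete: an adversary may replace an arbitrary ε fraction of the samples drawn from the mixture, and even a small fraction of such outliers can ruin naive moment-based estimators. The toy numpy sketch below (not the paper's algorithm; the parameters, the crude corruption, and the coordinate-wise trimming baseline are all illustrative assumptions) shows a mixture of two Gaussians in ℝ^d, an ε fraction of corruptions, and how badly the empirical mean degrades compared with even a simple trimmed estimator.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, eps = 5, 10_000, 0.1  # dimension, sample size, corruption fraction (illustrative)

# Uniform mixture of two Gaussians with means -2*1 and +2*1 (identity covariance).
mu = np.array([np.full(d, -2.0), np.full(d, 2.0)])
comp = rng.integers(0, 2, size=n)
X = mu[comp] + rng.standard_normal((n, d))

# Adversary replaces an eps fraction of the samples with arbitrary points.
n_bad = int(eps * n)
X[:n_bad] = 100.0  # a crude, far-away corruption

true_mean = mu.mean(axis=0)  # overall mixture mean is the zero vector

# Naive empirical mean: skewed by roughly eps * 100 in every coordinate.
naive = X.mean(axis=0)

# Coordinate-wise trimmed mean: drop the n_bad largest and smallest
# values in each coordinate before averaging (a baseline, not the paper's method).
trimmed = np.sort(X, axis=0)[n_bad : n - n_bad].mean(axis=0)

err_naive = np.linalg.norm(naive - true_mean)      # large
err_trimmed = np.linalg.norm(trimmed - true_mean)  # much smaller
print(err_naive, err_trimmed)
```

Coordinate-wise trimming suffices for this toy one-dimensional-style corruption, but in high dimensions an adversary can place outliers that evade any per-coordinate filter, which is why the paper's sum-of-squares clustering and robust tensor decomposition machinery is needed.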




