Beyond Parallel Pancakes: Quasi-Polynomial Time Guarantees for Non-Spherical Gaussian Mixtures

12/10/2021
by   Rares-Darius Buhai, et al.

We consider mixtures of k ≥ 2 Gaussian components with unknown means and unknown covariance (identical for all components) that are well-separated, i.e., distinct components have statistical overlap at most k^{-C} for a large enough constant C ≥ 1. Previous statistical-query lower bounds [DKS17] give formal evidence that even distinguishing such mixtures from (pure) Gaussians may be exponentially hard (in k).

We show that this kind of hardness can appear only if mixing weights are allowed to be exponentially small, and that for polynomially lower-bounded mixing weights, non-trivial algorithmic guarantees are possible in quasi-polynomial time. Concretely, we develop an algorithm based on the sum-of-squares method with running time quasi-polynomial in the minimum mixing weight. The algorithm can reliably distinguish between a mixture of k ≥ 2 well-separated Gaussian components and a (pure) Gaussian distribution. As a certificate, the algorithm computes a bipartition of the input sample that separates a pair of mixture components, i.e., both sides of the bipartition contain most of the sample points of at least one component. For the special case of collinear means, our algorithm outputs a k-clustering of the input sample that is approximately consistent with the components of the mixture.

A significant challenge for our results is that, unlike most previous results for Gaussian mixtures, they appear to be inherently sensitive to small fractions of adversarial outliers. The reason is that such outliers can simulate exponentially small mixing weights even for mixtures with polynomially lower-bounded mixing weights. A key technical ingredient is a characterization of separating directions for well-separated Gaussian components in terms of ratios of polynomials that correspond to moments of two carefully chosen orders logarithmic in the minimum mixing weight.
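The distinguishing problem above can be illustrated on a toy 1-D slice. The following is a minimal sketch, not the paper's algorithm: it assumes two equally weighted components with shared unit covariance and a known separation, and uses a simple fourth-moment statistic (excess kurtosis), which is near zero for a pure Gaussian but clearly negative for a well-separated bimodal mixture. The paper's actual method instead uses ratios of two carefully chosen moments of order logarithmic in the minimum mixing weight, with sum-of-squares certificates, and handles unknown covariance in high dimension.

```python
import random

def excess_kurtosis(xs):
    """Sample excess kurtosis: m4 / m2^2 - 3 (0 for a Gaussian)."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return m4 / m2 ** 2 - 3.0

rng = random.Random(0)
n, sep = 100_000, 4.0  # sample size and mean separation (assumed values)

# Pure Gaussian sample vs. equal-weight two-component mixture with
# shared unit variance and means at +/- sep/2.
pure = [rng.gauss(0.0, 1.0) for _ in range(n)]
mixture = [rng.gauss(rng.choice([-sep / 2, sep / 2]), 1.0) for _ in range(n)]

print(excess_kurtosis(pure))     # close to 0
print(excess_kurtosis(mixture))  # clearly negative (platykurtic)
```

For equal weights and means at ±a with unit variance, the mixture's excess kurtosis is (a^4 + 6a^2 + 3)/(a^2 + 1)^2 − 3, which tends to −2 as a grows; this is why a low-order moment suffices in this easy symmetric case, while skewed weights and unknown covariance require the higher, logarithmic-order moments used in the paper.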


Related research:
- 11/20/2017 — Mixture Models, Robustness, and Sum of Squares Proofs
- 05/13/2020 — Robustly Learning any Clusterable Mixture of Gaussians
- 02/02/2020 — EM Algorithm is Sample-Optimal for Learning Mixtures of Well-Separated Gaussians
- 06/22/2023 — SQ Lower Bounds for Learning Bounded Covariance GMMs
- 11/06/2020 — Settling the Robust Learnability of Mixtures of Gaussians
- 10/04/2021 — Clustering a Mixture of Gaussians with Unknown Covariance
- 12/08/2020 — Algorithms for finding k in k-means
