Clustering Mixtures with Almost Optimal Separation in Polynomial Time

by Jerry Li, et al.

We consider the problem of clustering mixtures of mean-separated Gaussians in high dimensions. We are given samples from a mixture of k identity-covariance Gaussians, so that the minimum pairwise distance between any two means is at least Δ, for some parameter Δ > 0, and the goal is to recover the ground truth clustering of these samples. It is folklore that separation Δ = Θ(√(log k)) is both necessary and sufficient to recover a good clustering, at least information theoretically. However, the estimators which achieve this guarantee are inefficient. We give the first algorithm which runs in polynomial time and which almost matches this guarantee. More precisely, we give an algorithm which takes polynomially many samples and time, and which can successfully recover a good clustering, so long as the separation is Δ = Ω(log^(1/2 + c) k), for any c > 0. Previously, polynomial time algorithms were only known for this problem when the separation was polynomial in k, and all algorithms which could tolerate Θ(√(log k)) separation required quasipolynomial time. We also extend our result to mixtures of translations of a distribution which satisfies the Poincaré inequality, under additional mild assumptions. Our main technical tool, which we believe is of independent interest, is a novel way to implicitly represent and estimate high-degree moments of a distribution, which allows us to extract important information about high-degree moments without ever writing down the full moment tensors explicitly.
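To make the problem setup concrete, here is a small self-contained sketch (not the paper's algorithm) that generates a mixture of k identity-covariance Gaussians with pairwise mean separation exactly Δ, then clusters the samples with a simple farthest-point-initialized Lloyd's iteration. All parameter values (k = 3, d = 50, Δ = 10) are illustrative choices in an easy regime well above the √(log k) threshold, where even this baseline succeeds.

```python
import numpy as np
from itertools import permutations

rng = np.random.default_rng(0)

k, d, n_per = 3, 50, 200
delta = 10.0  # separation Δ; toy value far above sqrt(log k), not the paper's regime

# Δ-separated means: (Δ/√2)·e_i gives pairwise distance exactly Δ
means = (delta / np.sqrt(2)) * np.eye(k, d)

# samples from the mixture of identity-covariance Gaussians, with ground-truth labels
X = np.vstack([m + rng.standard_normal((n_per, d)) for m in means])
truth = np.repeat(np.arange(k), n_per)

# farthest-point initialization followed by Lloyd's iterations (a simple baseline)
centers = [X[rng.integers(len(X))]]
for _ in range(k - 1):
    dists = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
    centers.append(X[np.argmax(dists)])
centers = np.array(centers)
for _ in range(20):
    labels = np.argmin(np.linalg.norm(X[:, None] - centers[None], axis=2), axis=1)
    centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])

# clustering accuracy up to relabeling (k = 3, so brute-forcing permutations is fine)
acc = max(np.mean(np.array(perm)[labels] == truth)
          for perm in permutations(range(k)))
print(f"clustering accuracy: {acc:.3f}")
```

At this separation the clusters are essentially non-overlapping along the mean-difference directions, so the baseline recovers the ground-truth clustering; the hard regime the paper addresses is Δ = log^(1/2 + c) k, where such naive methods break down.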




List-Decodable Robust Mean Estimation and Learning Mixtures of Spherical Gaussians

We study the problem of list-decodable Gaussian mean estimation and the ...

On Learning Mixtures of Well-Separated Gaussians

We consider the problem of efficiently learning mixtures of a large numb...

Robustly Learning any Clusterable Mixture of Gaussians

We study the efficient learnability of high-dimensional Gaussian mixture...

List-Decodable Mean Estimation in Nearly-PCA Time

Traditionally, robust statistics has focused on designing estimators tol...

EM Algorithm is Sample-Optimal for Learning Mixtures of Well-Separated Gaussians

We consider the problem of spherical Gaussian Mixture models with k ≥ 3 ...

Algorithms for finding k in k-means

k-means Clustering requires as input the exact value of k, the number of...

Learning a mixture of two subspaces over finite fields

We study the problem of learning a mixture of two subspaces over 𝔽_2^n. ...