Settling the Robust Learnability of Mixtures of Gaussians

11/06/2020
by Allen Liu, et al.

This work represents a natural coalescence of two important lines of work: learning mixtures of Gaussians and algorithmic robust statistics. In particular, we give the first provably robust algorithm for learning mixtures of any constant number of Gaussians. We require only mild assumptions: the mixing weights have bounded fractionality, and the total variation distance between components is bounded away from zero. At the heart of our algorithm is a new method for proving dimension-independent polynomial identifiability by applying a carefully chosen sequence of differential operations to certain generating functions. These generating functions encode not only the parameters we would like to learn but also the system of polynomial equations we would like to solve. We show how the symbolic identities we derive can be directly used to analyze a natural sum-of-squares relaxation.
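To give a flavor of the differential-operator idea, consider a minimal one-dimensional sketch (an illustrative toy example with hypothetical weights w_1, w_2 and parameters \mu_1, \mu_2; the paper's actual construction works with multivariate Gaussian generating functions and is considerably more involved). For a two-component generating function,

  F(x) = w_1 e^{\mu_1 x} + w_2 e^{\mu_2 x},
  \left(\tfrac{d}{dx} - \mu_1\right) F(x) = (\mu_2 - \mu_1)\, w_2\, e^{\mu_2 x},

the operator (d/dx - \mu_1) annihilates the first component exactly, since differentiating w_1 e^{\mu_1 x} reproduces it scaled by \mu_1. Composing such operators isolates components one at a time, and symbolic identities of this kind are the sort of ingredient that can feed into a sum-of-squares analysis.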

Related research

Robust Learning of Mixtures of Gaussians (07/12/2020)
We resolve one of the major outstanding problems in robust statistics. I...

Robustly Learning any Clusterable Mixture of Gaussians (05/13/2020)
We study the efficient learnability of high-dimensional Gaussian mixture...

Beyond Parallel Pancakes: Quasi-Polynomial Time Guarantees for Non-Spherical Gaussian Mixtures (12/10/2021)
We consider mixtures of k ≥ 2 Gaussian components with unknown means and ...

Outlier-Robust Clustering of Non-Spherical Mixtures (05/06/2020)
We give the first outlier-robust efficient algorithm for clustering a mi...

On the identifiability of mixtures of ranking models (01/31/2022)
Mixtures of ranking models are standard tools for ranking problems. Howe...

Efficiently Learning Mixtures of Mallows Models (08/17/2018)
Mixtures of Mallows models are a popular generative model for ranking da...