Improved Convergence Guarantees for Learning Gaussian Mixture Models by EM and Gradient EM

01/03/2021
by Nimrod Segol, et al.

We consider the problem of estimating the parameters of a Gaussian Mixture Model with K components of known weights, each with an identity covariance matrix. We make two contributions. First, at the population level, we present a sharper analysis of the local convergence of EM and gradient EM than in previous works. Assuming a separation of Ω(√(log K)), we prove convergence of both methods to the global optimum from an initialization region larger than those of previous works. Specifically, the initial guess for each component may be as far as (almost) half its distance to the nearest Gaussian. This is essentially the largest possible contraction region. Our second contribution is an improved sample size requirement for accurate estimation by EM and gradient EM. In previous works, the required number of samples depended quadratically on the maximal separation between the K components, and the resulting error estimate grew linearly with this maximal separation. In this manuscript we show that both quantities depend only logarithmically on the maximal separation.
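To make the setting concrete, the sketch below shows what the EM and gradient EM mean updates look like for a spherical mixture with known weights and identity covariances, where only the means are estimated. This is an illustrative sketch, not the authors' implementation; the function names, step-size parameter, and the toy example are our own assumptions.

```python
import numpy as np

def responsibilities(X, mus, weights):
    """E-step for a spherical GMM with identity covariances and known weights."""
    # Squared distances ||x_i - mu_k||^2, shape (n, K).
    d2 = ((X[:, None, :] - mus[None, :, :]) ** 2).sum(axis=-1)
    log_r = np.log(weights)[None, :] - 0.5 * d2
    log_r -= log_r.max(axis=1, keepdims=True)          # numerical stabilization
    r = np.exp(log_r)
    return r / r.sum(axis=1, keepdims=True)

def em_step(X, mus, weights):
    """One EM update of the means (weights and covariances stay fixed)."""
    r = responsibilities(X, mus, weights)
    return (r.T @ X) / r.sum(axis=0)[:, None]

def gradient_em_step(X, mus, weights, step=1.0):
    """One gradient EM step: gradient ascent on the Q-function w.r.t. the means."""
    r = responsibilities(X, mus, weights)
    n = X.shape[0]
    # Gradient is (1/n) * sum_i r_ik (x_i - mu_k) for each component k.
    grad = (r.T @ X - r.sum(axis=0)[:, None] * mus) / n
    return mus + step * grad

# Toy example (assumed, for illustration): well-separated spherical Gaussians
# with a nearby initialization of the means.
rng = np.random.default_rng(0)
K, d, n = 3, 2, 3000
true_mus = np.array([[0.0, 0.0], [6.0, 0.0], [0.0, 6.0]])
weights = np.full(K, 1.0 / K)
labels = rng.choice(K, size=n, p=weights)
X = true_mus[labels] + rng.standard_normal((n, d))
mus = true_mus + rng.standard_normal((K, d))
for _ in range(50):
    mus = em_step(X, mus, weights)      # or gradient_em_step(X, mus, weights)
```

The initialization in the toy example is chosen close to the true centers; the paper's local-convergence results concern initializations within (almost) half the distance to the nearest other center, and the step size for gradient EM would have to be chosen accordingly.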


