Statistical Convergence of the EM Algorithm on Gaussian Mixture Models

10/09/2018
by   Ruofei Zhao, et al.

We study the convergence behavior of the Expectation Maximization (EM) algorithm on Gaussian mixture models with an arbitrary number of mixture components and mixing weights. We show that, as long as the means of the components are separated by at least Ω(√(min{M,d})), where M is the number of components and d is the dimension, the EM algorithm converges locally to the global optimum of the log-likelihood. We further show that the convergence rate is linear, and we characterize the size of the basin of attraction of the global optimum.
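To make the object of study concrete, the sketch below runs plain EM on a spherical Gaussian mixture with known unit variance, starting from an initialization inside the basin of attraction of the well-separated true means. The function name, the toy data, and the initialization are illustrative assumptions for this page, not the paper's exact setup or analysis.

```python
import numpy as np

def em_step(X, means, weights, var=1.0):
    """One EM iteration for a spherical GMM with known shared variance.
    Illustrative sketch only."""
    n, d = X.shape
    # E-step: posterior responsibilities p(z_i = m | x_i), computed in
    # log space for numerical stability.
    sq = ((X[:, None, :] - means[None, :, :]) ** 2).sum(axis=2)  # (n, M)
    log_r = np.log(weights)[None, :] - sq / (2 * var)
    log_r -= log_r.max(axis=1, keepdims=True)
    r = np.exp(log_r)
    r /= r.sum(axis=1, keepdims=True)
    # M-step: re-estimate mixing weights and component means.
    nk = r.sum(axis=0)
    return (r.T @ X) / nk[:, None], nk / n

# Toy run: two well-separated components in d = 2.
rng = np.random.default_rng(0)
true_means = np.array([[5.0, 0.0], [-5.0, 0.0]])
X = np.vstack([true_means[0] + rng.standard_normal((200, 2)),
               true_means[1] + rng.standard_normal((200, 2))])
means = np.array([[4.0, 1.0], [-4.0, -1.0]])   # inside the basin
weights = np.array([0.5, 0.5])
for _ in range(30):
    means, weights = em_step(X, means, weights)
```

With this separation the iterates stay near the true means, consistent with the local linear convergence the abstract describes; with poorly separated components or a bad initialization, EM can instead stall at a spurious stationary point.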


Related research

10/12/2018
Global Convergence of EM Algorithm for Mixtures of Two Component Linear Regression
The Expectation-Maximization algorithm is perhaps the most broadly used ...

02/02/2020
EM Algorithm is Sample-Optimal for Learning Mixtures of Well-Separated Gaussians
We consider the problem of spherical Gaussian Mixture models with k ≥ 3 ...

01/16/2013
Model Selection for Gaussian Mixture Models
This paper is concerned with an important issue in finite mixture modell...

08/19/2019
Quantum Expectation-Maximization for Gaussian Mixture Models
The Expectation-Maximization (EM) algorithm is a fundamental tool in uns...

04/30/2021
A Riemannian Newton Trust-Region Method for Fitting Gaussian Mixture Models
Gaussian Mixture Models are a powerful tool in Data Science and Statisti...

07/08/2019
Comparing EM with GD in Mixture Models of Two Components
The expectation-maximization (EM) algorithm has been widely used in mini...

06/27/2012
Convergence of the EM Algorithm for Gaussian Mixtures with Unbalanced Mixing Coefficients
The speed of convergence of the Expectation Maximization (EM) algorithm ...
