Model Selection for Gaussian Mixture Models

01/16/2013
by Tao Huang, et al.

This paper addresses an important issue in finite mixture modelling: the selection of the number of mixing components. We propose a new penalized likelihood method for model selection in finite multivariate Gaussian mixture models. The proposed method is shown to be statistically consistent in determining the number of components. A modified EM algorithm is developed to simultaneously select the number of components and estimate the mixing probabilities (weights) and the parameters of the component Gaussian distributions. Simulations and a real data analysis illustrate the performance of the proposed method.
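
The abstract does not spell out the penalty or the modified EM algorithm, so the snippet below is only a minimal sketch of the general idea it describes: penalized-likelihood selection of the number of Gaussian mixture components. It uses scikit-learn's GaussianMixture with the BIC as a stand-in penalty; the synthetic data, the candidate range of components, and the BIC criterion are all assumptions for illustration, not the authors' method.

```python
# Hypothetical sketch: choosing the number of Gaussian mixture components
# by minimizing a penalized-likelihood criterion (BIC used here as a
# stand-in penalty). This is NOT the paper's modified EM algorithm.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic 2-D data drawn from three well-separated Gaussian clusters.
X = np.vstack([
    rng.normal(loc=[0.0, 0.0], scale=0.5, size=(200, 2)),
    rng.normal(loc=[4.0, 4.0], scale=0.7, size=(200, 2)),
    rng.normal(loc=[0.0, 5.0], scale=0.6, size=(200, 2)),
])

best_k, best_bic = None, np.inf
for k in range(1, 9):
    gmm = GaussianMixture(n_components=k, covariance_type="full",
                          n_init=5, random_state=0).fit(X)
    bic = gmm.bic(X)  # -2*log-likelihood plus a penalty on the parameter count
    if bic < best_bic:
        best_k, best_bic = k, bic

print(f"Selected number of components: {best_k} (BIC = {best_bic:.1f})")
```

On data like this the loop typically selects three components. Note the contrast with the paper's approach: here the number of components is chosen by refitting over a grid of candidate values, whereas the abstract's modified EM algorithm selects the number of components and estimates the mixture parameters simultaneously within a single penalized-likelihood fit.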

Related research

10/09/2018  Statistical Convergence of the EM Algorithm on Gaussian Mixture Models
We study the convergence behavior of the Expectation Maximization (EM) a...

12/19/2020  Robust mixture regression with Exponential Power distribution
Assuming an exponential power distribution is one way to deal with outli...

12/24/2018  Model Selection for Mixture Models - Perspectives and Strategies
Determining the number G of components in a finite mixture distribution ...

04/07/2020  Repulsive Mixture Models of Exponential Family PCA for Clustering
The mixture extension of exponential family principal component analysis...

02/23/2023  Detecting Signs of Model Change with Continuous Model Selection Based on Descriptive Dimensionality
We address the issue of detecting changes of models that lie behind a da...

06/05/2021  Network Estimation by Mixing: Adaptivity and More
Networks analysis has been commonly used to study the interactions betwe...

06/05/2021  Sparsification for Sums of Exponentials and its Algorithmic Applications
Many works in signal processing and learning theory operate under the as...