
A Riemannian Newton Trust-Region Method for Fitting Gaussian Mixture Models

04/30/2021
by Lena Sembach, et al.

Gaussian Mixture Models are a powerful tool in Data Science and Statistics, used mainly for clustering and density approximation. In practice, the task of estimating the model parameters is often solved with the Expectation Maximization (EM) algorithm, which is attractive for its simplicity and low per-iteration cost. However, EM converges slowly when a large share of the information is hidden or the clusters overlap. Recent advances in manifold optimization for Gaussian Mixture Models have therefore attracted increasing interest. We introduce a formula for the Riemannian Hessian for Gaussian Mixture Models and, building on it, propose a new Riemannian Newton Trust-Region method that outperforms current approaches in both runtime and number of iterations.

10/09/2018

Statistical Convergence of the EM Algorithm on Gaussian Mixture Models

We study the convergence behavior of the Expectation Maximization (EM) a...
06/25/2015

Manifold Optimization for Gaussian Mixture Models

We take a new look at parameter estimation for Gaussian Mixture Models (...
04/11/2019

Direct Fitting of Gaussian Mixture Models

When fitting Gaussian Mixture Models to 3D geometry, the model is typica...
11/18/2020

Surrogate modeling approximation using a mixture of experts based on EM joint estimation

An automatic method to combine several local surrogate models is present...
12/20/2022

An Adaptive Covariance Parameterization Technique for the Ensemble Gaussian Mixture Filter

The ensemble Gaussian mixture filter combines the simplicity and power o...
06/22/2020

Deep Residual Mixture Models

We propose Deep Residual Mixture Models (DRMMs) which share the many des...
05/21/2018

A universal framework for learning based on the elliptical mixture model (EMM)

An increasing prominence of unbalanced and noisy data highlights the imp...