A Rigorous Link Between Self-Organizing Maps and Gaussian Mixture Models

09/24/2020
by Alexander Gepperth, et al.

This work presents a mathematical treatment of the relation between Self-Organizing Maps (SOMs) and Gaussian Mixture Models (GMMs). We show that energy-based SOM models can be interpreted as performing gradient descent, minimizing an approximation to the GMM log-likelihood that is particularly valid for high data dimensionalities. The SOM-like decrease of the neighborhood radius can be understood as an annealing procedure that keeps gradient descent from getting stuck in undesirable local minima. This link allows SOMs to be treated as generative probabilistic models, giving a formal justification for using them, e.g., to detect outliers or for sampling.
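The idea sketched in the abstract can be illustrated with a minimal example: train a SOM with an exponentially annealed neighborhood radius, then reinterpret the trained prototypes as the means of an isotropic GMM with uniform mixture weights and use the resulting log-likelihood as an outlier score. This is a hedged sketch, not the paper's implementation; the grid size, learning-rate schedule, and the shared variance `var` are illustrative choices.

```python
import numpy as np

def train_som(data, grid_shape=(8, 8), epochs=30, lr0=0.5,
              sigma0=4.0, sigma_min=0.5, seed=0):
    """Train a 2-D SOM on `data`, annealing the neighborhood radius sigma."""
    rng = np.random.default_rng(seed)
    n_units = grid_shape[0] * grid_shape[1]
    dim = data.shape[1]
    # Prototype vectors, one per grid unit.
    W = rng.normal(scale=0.1, size=(n_units, dim))
    # Grid coordinates of each unit, used by the neighborhood function.
    gy, gx = np.meshgrid(np.arange(grid_shape[0]), np.arange(grid_shape[1]),
                         indexing="ij")
    coords = np.stack([gy.ravel(), gx.ravel()], axis=1).astype(float)
    for epoch in range(epochs):
        # Exponential annealing of radius and learning rate (the
        # "annealing procedure" the abstract refers to).
        t = epoch / (epochs - 1)
        sigma = max(sigma0 * (sigma_min / sigma0) ** t, sigma_min)
        lr = lr0 * (0.01 / lr0) ** t
        for x in rng.permutation(data):
            # Best-matching unit (smallest Euclidean distance).
            bmu = np.argmin(((W - x) ** 2).sum(axis=1))
            # Gaussian neighborhood weights on the grid.
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            h = np.exp(-d2 / (2 * sigma ** 2))
            # Online update: pull prototypes toward x, weighted by h.
            W += lr * h[:, None] * (x - W)
    return W

def gmm_log_likelihood(x, W, var=0.05):
    """Score x under the SOM read as an isotropic, uniform-weight GMM."""
    diff = W - x
    log_comp = (-0.5 * (diff ** 2).sum(axis=1) / var
                - 0.5 * W.shape[1] * np.log(2 * np.pi * var))
    # Stable log-mean-exp over the mixture components.
    m = log_comp.max()
    return m + np.log(np.exp(log_comp - m).mean())
```

Under this reading, a point far from all prototypes receives a low log-likelihood, which is the formal justification for using a trained SOM as an outlier detector.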

