Expectation-Maximization for Adaptive Mixture Models in Graph Optimization

11/12/2018
by Tim Pfeifer, et al.

Non-Gaussian and multimodal distributions are an important part of many recent robust sensor fusion algorithms. In contrast to robust cost functions, they are probabilistically founded and offer good convergence properties. Since their robustness depends on a close approximation of the true error distribution, their parametrization is crucial. We propose a novel approach that adapts a multimodal Gaussian mixture model to the error distribution of a sensor fusion problem. By applying expectation-maximization, we obtain a computationally efficient solution with well-behaved convergence properties. We demonstrate the performance of the resulting algorithms on several real-world GNSS and indoor localization datasets. The proposed self-tuning mixture algorithm outperforms state-of-the-art approaches with static parametrization.
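The paper's own adaptation scheme is not reproduced here, but as a rough illustration of the core idea, the following minimal sketch fits a one-dimensional Gaussian mixture to a batch of measurement residuals with the standard EM updates (responsibilities in the E-step, closed-form weight/mean/variance updates in the M-step). The function name `em_gmm_1d`, the component count, and the initialization are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def em_gmm_1d(residuals, n_components=3, n_iters=50, tol=1e-6):
    """Illustrative sketch: fit a 1-D Gaussian mixture to residuals via EM.

    Assumptions (not from the paper): fixed component count, uniform
    initial weights, means spread over the residual range.
    """
    r = np.asarray(residuals, dtype=float)
    n = r.size
    w = np.full(n_components, 1.0 / n_components)      # mixture weights
    mu = np.linspace(r.min(), r.max(), n_components)   # component means
    var = np.full(n_components, r.var() + 1e-9)        # component variances

    prev_ll = -np.inf
    for _ in range(n_iters):
        # E-step: responsibilities gamma[i, k] = p(component k | residual i)
        pdf = np.exp(-0.5 * (r[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        weighted = w * pdf
        total = weighted.sum(axis=1, keepdims=True) + 1e-300  # guard underflow
        gamma = weighted / total

        # M-step: closed-form updates for weights, means, and variances
        Nk = gamma.sum(axis=0) + 1e-12
        w = Nk / n
        mu = (gamma * r[:, None]).sum(axis=0) / Nk
        var = (gamma * (r[:, None] - mu) ** 2).sum(axis=0) / Nk + 1e-9

        # Stop once the log-likelihood no longer improves
        ll = np.log(total).sum()
        if ll - prev_ll < tol:
            break
        prev_ll = ll
    return w, mu, var
```

In a self-tuning setup like the one the abstract describes, such a fit would presumably alternate with the graph optimization: the mixture is re-estimated from the current residuals and then parametrizes the robust (mixture) factors in the next optimization step.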


Related research

- Incrementally Learned Mixture Models for GNSS Localization (04/30/2019): GNSS localization is an important part of today's autonomous systems, al...
- PLUME: Polyhedral Learning Using Mixture of Experts (04/22/2019): In this paper, we propose a novel mixture of expert architecture for lea...
- Superpixel Segmentation Using Gaussian Mixture Model (12/28/2016): Superpixel segmentation algorithms are to partition an image into percep...
- Exponentially-Modified Gaussian Mixture Model: Applications in Spectroscopy (02/14/2019): We propose a novel exponentially-modified Gaussian (EMG) mixture residua...
- Regularization of Mixture Models for Robust Principal Graph Learning (06/16/2021): A regularized version of Mixture Models is proposed to learn a principal...
- Conjugate Mixture Models for Clustering Multimodal Data (12/09/2020): The problem of multimodal clustering arises whenever the data are gather...
