Variational Inference with Mixture Model Approximation: Robotic Applications

05/23/2019
by Emmanuel Pignat et al.

We propose a method to approximate the distribution of robot configurations satisfying multiple objectives. Our approach uses Variational Inference, a popular method in Bayesian computation, which has several advantages over sampling-based techniques. To represent the complex and multimodal distribution of configurations, we propose to use a mixture model as the approximate distribution, an approach that has recently gained popularity. In this work, we show the interesting properties of this approach and how it can be applied to a range of problems.
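The abstract does not include an implementation, but the core idea, fitting a Gaussian mixture model to an unnormalized, multimodal target density by minimizing a reverse KL divergence (maximizing the ELBO), can be sketched in a few lines. The snippet below is a minimal illustration in JAX, not the authors' code: the two-mode log_target standing in for a product of task objectives, the diagonal-covariance mixture family, and all parameter values are assumptions made for this example.

# Minimal sketch (not the authors' implementation): reverse-KL variational
# inference with a Gaussian mixture model as the approximate distribution.
# The target is a hypothetical unnormalized density over a 2-D configuration.
import jax
import jax.numpy as jnp

def log_target(x):
    # Unnormalized log-density with two modes, standing in for the
    # multimodal distribution of configurations satisfying the objectives.
    modes = jnp.array([[-2.0, 0.0], [2.0, 0.0]])
    d2 = jnp.sum((x[None, :] - modes) ** 2, axis=-1)
    return jax.scipy.special.logsumexp(-0.5 * d2)

def log_q(x, params):
    # Log-density of a diagonal-covariance Gaussian mixture q(x).
    logits, means, log_stds = params
    log_w = jax.nn.log_softmax(logits)
    comp = -0.5 * jnp.sum(((x - means) / jnp.exp(log_stds)) ** 2
                          + 2.0 * log_stds + jnp.log(2.0 * jnp.pi), axis=-1)
    return jax.scipy.special.logsumexp(log_w + comp)

def neg_elbo(params, key, n_samples=64):
    # E_q[log q(x) - log p~(x)], estimated with reparameterized samples
    # drawn per component and weighted by the mixture weights.
    logits, means, log_stds = params
    w = jax.nn.softmax(logits)
    K, D = means.shape
    eps = jax.random.normal(key, (n_samples, K, D))
    x = means[None] + jnp.exp(log_stds)[None] * eps      # (S, K, D)
    f = jax.vmap(jax.vmap(lambda xi: log_q(xi, params) - log_target(xi)))(x)
    return jnp.mean(f @ w)

@jax.jit
def step(params, key, lr=0.05):
    # One stochastic gradient descent step on the negative ELBO.
    loss, grads = jax.value_and_grad(neg_elbo)(params, key)
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads), loss

key = jax.random.PRNGKey(0)
K, D = 4, 2
params = (jnp.zeros(K),                                   # mixture logits
          jax.random.normal(key, (K, D)),                 # component means
          jnp.zeros((K, D)) - 1.0)                        # log std devs
for i in range(500):
    key, sub = jax.random.split(key)
    params, loss = step(params, sub)
# params now parameterize a mixture approximation of the multimodal target.

Sampling each component separately with the reparameterization trick keeps the ELBO estimator differentiable in the component means and standard deviations, while the gradient with respect to the mixture weights flows through the explicit per-component weighting; this per-component decomposition is one common way of handling mixture families in variational inference.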


