A Unified Perspective on Natural Gradient Variational Inference with Gaussian Mixture Models

09/23/2022
by Oleg Arenz, et al.

Variational inference with Gaussian mixture models (GMMs) enables learning of highly tractable yet multi-modal approximations of intractable target distributions. GMMs are particularly relevant for problem settings with up to a few hundred dimensions, for example in robotics, for modelling distributions over trajectories or joint distributions. This work focuses on two very effective methods for GMM-based variational inference that both employ independent natural gradient updates for the individual components and for the categorical distribution of the weights. We show, for the first time, that their derived updates are equivalent, although their practical implementations and theoretical guarantees differ. We identify several design choices that distinguish the two approaches, namely with respect to sample selection, natural gradient estimation, stepsize adaptation, and whether trust regions are enforced or the number of components is adapted. We perform extensive ablations on these design choices and show that they strongly affect the efficiency of the optimization and the variability of the learned distribution. Based on our insights, we propose a novel instantiation of our generalized framework that combines first-order natural gradient estimates with trust regions and component adaptation, and that significantly outperforms both previous methods in all our experiments.
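To make the shared update structure concrete, below is a minimal NumPy sketch of GMM-based variational inference with independent natural-gradient-style updates for the individual components and for the categorical weights. This is an illustration of the general scheme, not the paper's algorithm: the toy target, the fixed stepsize beta, the Stein-lemma-based first-order natural gradient estimator, and the exponential weight update are all assumptions made for this example, and it omits sample reuse, stepsize adaptation, trust regions, and component adaptation.

import numpy as np
from scipy.stats import multivariate_normal as mvn

rng = np.random.default_rng(0)

# Illustrative target: an unnormalized two-component Gaussian mixture.
T_MEANS = np.array([[-2.0, 0.0], [2.0, 0.0]])
T_COV = 0.5 * np.eye(2)

def log_target(x):
    logps = np.stack([mvn.logpdf(x, m, T_COV) for m in T_MEANS])
    return np.logaddexp.reduce(logps, axis=0)

def grad_log_target(x):
    logps = np.stack([mvn.logpdf(x, m, T_COV) for m in T_MEANS])
    resp = np.exp(logps - np.logaddexp.reduce(logps, axis=0))
    dirs = np.stack([(m - x) @ np.linalg.inv(T_COV) for m in T_MEANS])
    return np.einsum('kn,knd->nd', resp, dirs)

# GMM approximation q(x) = sum_o w_o N(x; mu_o, cov_o).
K, D = 2, 2
w = np.full(K, 1.0 / K)
mus = rng.normal(size=(K, D))
covs = np.stack([np.eye(D)] * K)

def log_q(x):
    logps = np.stack([np.log(w[o]) + mvn.logpdf(x, mus[o], covs[o]) for o in range(K)])
    return np.logaddexp.reduce(logps, axis=0)

def grad_log_q(x):
    logps = np.stack([np.log(w[o]) + mvn.logpdf(x, mus[o], covs[o]) for o in range(K)])
    resp = np.exp(logps - np.logaddexp.reduce(logps, axis=0))
    dirs = np.stack([(mus[o] - x) @ np.linalg.inv(covs[o]) for o in range(K)])
    return np.einsum('kn,knd->nd', resp, dirs)

beta, n_samples = 0.05, 200      # fixed stepsize; the methods in the paper adapt it
for it in range(500):
    rewards = np.empty(K)
    for o in range(K):           # components are updated in sequence here for simplicity
        prec = np.linalg.inv(covs[o])
        x = rng.multivariate_normal(mus[o], covs[o], n_samples)
        h = log_target(x) - log_q(x)            # per-sample "reward" for this component
        g = grad_log_target(x) - grad_log_q(x)  # first-order information only
        rewards[o] = h.mean()
        # Stein's lemma yields a Hessian estimate from gradients alone:
        # E[hess h] = prec @ E[(x - mu) grad_h(x)^T]; symmetrize the Monte-Carlo estimate.
        H = prec @ ((x - mus[o]).T @ g) / n_samples
        H = 0.5 * (H + H.T)
        prec_new = prec - beta * H              # natural-gradient step on the precision
        covs[o] = np.linalg.inv(prec_new)       # small beta keeps it positive definite here
        mus[o] = mus[o] + beta * covs[o] @ g.mean(axis=0)
    # Independent natural-gradient-style update of the categorical weights.
    logw = np.log(w) + beta * rewards
    w = np.exp(logw - np.logaddexp.reduce(logw))

print("weights:", w)
print("means:\n", mus)

On this toy target the two components latch onto the two modes; the design choices the paper ablates (sample selection, natural gradient estimation, stepsize adaptation, trust regions, and the number of components) are exactly the pieces this sketch fixes by hand.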
