Trust-Region Variational Inference with Gaussian Mixture Models

07/10/2019
by Oleg Arenz, et al.

Many methods for machine learning rely on approximate inference from intractable probability distributions. Variational inference approximates such distributions by tractable models that can subsequently be used for approximate inference. Learning sufficiently accurate approximations requires a rich model family and careful exploration of the relevant modes of the target distribution. We propose a method for learning accurate Gaussian mixture model (GMM) approximations of intractable probability distributions that draws on insights from policy search by establishing information-geometric trust regions for principled exploration. For efficient improvement of the GMM approximation, we derive a lower bound on the corresponding optimization objective that enables us to update the components independently. The use of the lower bound ensures convergence to a local optimum of the original objective. The number of components is adapted online by adding new components in promising regions and deleting components with negligible weight. We demonstrate on several domains that we can learn approximations of complex, multi-modal distributions with a quality unmet by previous variational inference methods, and that the GMM approximation can be used to draw samples on par with those produced by state-of-the-art MCMC samplers while requiring up to three orders of magnitude fewer computational resources.
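The optimization objective underlying this class of methods, the evidence lower bound E_q[log p̃(x) − log q(x)] for a GMM approximation q of an unnormalized target p̃, can be illustrated with a minimal Monte Carlo sketch. This is not the paper's trust-region or lower-bound update scheme; the toy bimodal target and all function names below are purely illustrative:

```python
import numpy as np

def log_gmm(x, weights, means, stds):
    # Log-density of a 1-D Gaussian mixture evaluated at points x.
    # (For many components, scipy.special.logsumexp is numerically safer.)
    comp = (-0.5 * ((x[:, None] - means[None, :]) / stds[None, :]) ** 2
            - np.log(stds[None, :] * np.sqrt(2.0 * np.pi)))
    return np.log(np.exp(comp) @ weights)

def elbo(log_target, weights, means, stds, n=20000, seed=0):
    # Monte Carlo estimate of E_q[log p~(x) - log q(x)] with q a GMM:
    # sample a component index, then sample from that Gaussian.
    rng = np.random.default_rng(seed)
    ks = rng.choice(len(weights), size=n, p=weights)
    x = rng.normal(means[ks], stds[ks])
    return np.mean(log_target(x) - log_gmm(x, weights, means, stds))

# Illustrative unnormalized bimodal target: 3 * [0.5 N(-2, 0.5^2) + 0.5 N(2, 0.5^2)].
def log_target(x):
    return np.log(3.0) + log_gmm(x, np.array([0.5, 0.5]),
                                 np.array([-2.0, 2.0]), np.array([0.5, 0.5]))

# A GMM matching both modes attains the maximal ELBO, log 3 (the log-normalizer);
# a single broad Gaussian is penalized by its KL divergence from the target.
good = elbo(log_target, np.array([0.5, 0.5]), np.array([-2.0, 2.0]), np.array([0.5, 0.5]))
bad = elbo(log_target, np.array([1.0]), np.array([0.0]), np.array([2.0]))
print(good, bad)  # good is near log(3) ~ 1.0986; bad is strictly smaller
```

The gap between the two estimates is exactly the mode-coverage issue the abstract describes: a uni-modal approximation of a multi-modal target leaves ELBO on the table, which is what motivates adding components in promising regions.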


