Robust scalable initialization for Bayesian variational inference with multi-modal Laplace approximations

07/12/2023
by Wyatt Bridgman, et al.

For predictive modeling relying on Bayesian inversion, fully independent, or “mean-field”, Gaussian distributions are often used as approximate probability density functions in variational inference, since the number of variational parameters is then only twice the number of unknown model parameters. The resulting diagonal covariance structure, coupled with unimodal behavior, can be too restrictive when dealing with highly non-Gaussian posteriors, including multimodal ones. High-fidelity surrogate posteriors in the form of Gaussian mixtures can approximate any distribution to an arbitrary degree of accuracy while retaining some analytical tractability. However, variational inference with full-covariance Gaussian mixtures suffers from quadratic growth in the number of variational parameters with the number of model parameters. Coupled with the multiple local minima induced by the nonconvex loss functions typical of variational inference, these challenges motivate robust initialization procedures that improve the performance and scalability of variational inference with mixture models. In this work, we propose a method for constructing an initial Gaussian mixture model approximation that can be used to warm-start the iterative solvers for variational inference. The procedure begins with an optimization stage in model parameter space in which local gradient-based optimization, globalized through multistart, determines a set of local maxima of the posterior, which we take as the mixture component centers. Around each mode, a local Gaussian approximation is constructed via the Laplace method. Finally, the mixture weights are determined through constrained least-squares regression. Robustness and scalability are demonstrated on synthetic tests, and the methodology is applied to an inversion problem in structural dynamics involving unknown viscous damping coefficients.
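To make the three stages concrete, the sketch below walks through them on a synthetic bimodal density standing in for the posterior. It is a minimal illustration under stated assumptions, not the authors' implementation: the names (log_post, hessian_fd), the finite-difference Hessian, the BFGS multistart settings, and the use of nonnegative least squares followed by normalization as the constrained weight regression are all illustrative choices.

```python
# Minimal sketch of the warm-start procedure: (1) multistart gradient-based
# optimization to find posterior modes, (2) a Laplace approximation at each
# mode, (3) mixture weights via constrained (nonnegative) least squares.
# The target density, tolerances, and solver choices are assumptions.
import numpy as np
from scipy.optimize import minimize, nnls
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)

# Synthetic bimodal target (stand-in for an unnormalized posterior).
MU = [np.array([-2.0, 0.0]), np.array([2.0, 1.0])]
COV = [0.3 * np.eye(2), 0.5 * np.eye(2)]

def log_post(x):
    p = sum(0.5 * multivariate_normal.pdf(x, m, c) for m, c in zip(MU, COV))
    return np.log(p)

def hessian_fd(f, x, h=1e-4):
    """Central finite-difference Hessian of a scalar function f at x."""
    d = len(x)
    H = np.zeros((d, d))
    for i in range(d):
        for j in range(d):
            ei, ej = np.eye(d)[i] * h, np.eye(d)[j] * h
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4.0 * h * h)
    return H

# Stage 1: local optimization, globalized through multistart, to locate
# the modes that will serve as mixture component centers.
modes = []
for x0 in rng.uniform(-5, 5, size=(20, 2)):
    res = minimize(lambda x: -log_post(x), x0, method="BFGS")
    if res.success and not any(np.linalg.norm(res.x - m) < 1e-2 for m in modes):
        modes.append(res.x)

# Stage 2: Laplace approximation around each mode; the local covariance is
# the inverse Hessian of the negative log-posterior at the mode.
covs = [np.linalg.inv(hessian_fd(lambda x: -log_post(x), m)) for m in modes]

# Stage 3: mixture weights by constrained least squares, matching the
# mixture density to (unnormalized) posterior evaluations at sample points.
X = rng.uniform(-5, 5, size=(400, 2))
Phi = np.column_stack([multivariate_normal.pdf(X, m, c)
                       for m, c in zip(modes, covs)])
w, _ = nnls(Phi, np.exp(log_post(X)))
w /= w.sum()  # normalize so the weights define a valid mixture

print("modes:\n", np.round(modes, 2))
print("weights:", np.round(w, 3))
```

Nonnegativity plus a final normalization is one simple way to realize the constrained regression; fitting against unnormalized posterior evaluations works because the normalization step absorbs the unknown evidence constant.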
