Stochastic Annealing for Variational Inference

05/25/2015
by San Gultekin et al.

We empirically evaluate a stochastic annealing strategy for Bayesian posterior optimization with variational inference. Variational inference is a deterministic approach to approximate posterior inference in Bayesian models, in which a typically non-convex objective function is locally optimized over the parameters of the approximating distribution. We investigate an annealing method for optimizing this objective with the aim of finding a better local optimum, and we compare it against deterministic annealing methods and against no annealing. We show that stochastic annealing yields clear improvements for the Gaussian mixture model (GMM) and hidden Markov model (HMM), while performance on latent Dirichlet allocation (LDA) tends to favor deterministic annealing.
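To make the annealing idea concrete, the following is a minimal sketch of tempered variational inference in Python. It fits a mean-field Gaussian q(z) to a bimodal target by gradient ascent on a tempered ELBO, E_q[log p(z)] + T·H(q), where the temperature T is cooled from T0 > 1 down to 1; at high T the inflated entropy bonus keeps q broad enough to see both modes before it commits to one. The target density, the linear cooling schedule, and all parameter values are illustrative assumptions, and the sketch shows generic deterministic annealing rather than the paper's specific stochastic procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unnormalized log posterior: a bimodal 1-D Gaussian mixture (illustrative target).
MU1, MU2, S = -2.0, 2.0, 0.5

def log_p(z):
    a = -0.5 * ((z - MU1) / S) ** 2
    b = -0.5 * ((z - MU2) / S) ** 2
    return np.logaddexp(a, b)  # additive constants dropped

def dlog_p(z):
    a = -0.5 * ((z - MU1) / S) ** 2
    b = -0.5 * ((z - MU2) / S) ** 2
    w = 1.0 / (1.0 + np.exp(b - a))  # responsibility of the first component
    return w * (-(z - MU1) / S**2) + (1.0 - w) * (-(z - MU2) / S**2)

# Mean-field Gaussian q(z) = N(mu, exp(log_s)^2); tempered ELBO:
#   L_T = E_q[log p(z)] + T * H(q),  with H(q) = log_s + const.
mu, log_s = 0.0, 0.0
lr, n_mc, n_iters, T0 = 0.05, 64, 500, 5.0  # assumed hyperparameters

for t in range(n_iters):
    T = 1.0 + (T0 - 1.0) * (1.0 - t / n_iters)  # linear cooling T0 -> 1
    eps = rng.standard_normal(n_mc)
    z = mu + np.exp(log_s) * eps                # reparameterization trick
    g = dlog_p(z)
    grad_mu = g.mean()                          # d/dmu of E_q[log p]
    grad_log_s = (g * np.exp(log_s) * eps).mean() + T  # entropy gradient is T
    mu += lr * grad_mu                          # gradient ascent on L_T
    log_s += lr * grad_log_s

print(f"q(z) ~ N({mu:.2f}, {np.exp(log_s):.2f}^2)")
```

Replacing the deterministic schedule with a randomly perturbed temperature (for example, adding noise to T at each iteration) would be one plausible, though hypothetical, reading of how a stochastic annealing variant could differ from the deterministic baseline.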


Related research

Quantum Annealing for Variational Bayes Inference (08/09/2014)
This paper presents studies on a deterministic annealing algorithm based...

Proximity Variational Inference (05/24/2017)
Variational inference is a powerful approach for approximate posterior i...

Variational Tempering (11/07/2014)
Variational inference (VI) combined with data subsampling enables approx...

LINFA: a Python library for variational inference with normalizing flow and annealing (07/10/2023)
Variational inference is an increasingly popular method in statistics an...

CMOS-compatible Ising and Potts Annealing Using Single Photon Avalanche Diodes (11/22/2022)
Massively parallel annealing processors may offer superior performance f...

Quantum Advantage in Variational Bayes Inference (07/07/2022)
Variational Bayes (VB) inference algorithm is used widely to estimate bo...

Deep State Space Models for Unconditional Word Generation (06/12/2018)
Autoregressive feedback is considered a necessity for successful uncondi...
