On the Convergence of Coordinate Ascent Variational Inference

06/01/2023
by   Anirban Bhattacharya, et al.

As a computational alternative to Markov chain Monte Carlo approaches, variational inference (VI) is increasingly popular for approximating intractable posterior distributions in large-scale Bayesian models, owing to its comparable efficacy and superior efficiency. Several recent works provide theoretical justifications for VI by proving its statistical optimality for parameter estimation under various settings; by contrast, formal analysis of the algorithmic convergence of VI is still largely lacking. In this paper, we consider the common coordinate ascent variational inference (CAVI) algorithm for implementing mean-field (MF) VI, which optimizes a Kullback–Leibler divergence objective functional over the space of all factorized distributions. Focusing on the two-block case, we analyze the convergence of CAVI by leveraging the extensive toolbox from functional analysis and optimization. We provide general conditions for certifying global or local exponential convergence of CAVI. Specifically, we introduce a new notion of generalized correlation for characterizing the interaction between the constituent blocks in influencing the VI objective functional, which, according to the theory, quantifies the algorithmic contraction rate of two-block CAVI. As illustrations, we apply the developed theory to a number of examples and derive explicit problem-dependent upper bounds on the algorithmic contraction rate.
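To make the contraction phenomenon concrete, here is a minimal sketch of two-block CAVI on a toy problem where everything is available in closed form: a bivariate standard Gaussian target with correlation `rho`, approximated by a factorized Gaussian. This example and the variable names are illustrative assumptions, not taken from the paper; in this classical case the mean-field updates reduce to `m1 <- rho * m2` and `m2 <- rho * m1`, so the variational means converge to the true posterior mean exponentially fast, with per-sweep contraction factor `rho**2` playing the role of the (generalized) correlation between the two blocks.

```python
# Two-block CAVI for a bivariate standard Gaussian target with
# correlation rho, using a fully factorized Gaussian q(x1) q(x2).
# For this target the optimal coordinate update of each variational
# mean is m_i <- rho * m_j; the variances are fixed at 1 - rho**2.
rho = 0.9
m1, m2 = 5.0, -3.0          # arbitrary initialization of the two blocks
errors = []                  # |m2| after each full sweep (true mean is 0)

for sweep in range(30):
    m1 = rho * m2            # CAVI update for block 1 given q(x2)
    m2 = rho * m1            # CAVI update for block 2 given q(x1)
    errors.append(abs(m2))

# Each full sweep multiplies the error by exactly rho**2,
# i.e. exponential (linear-rate) convergence with contraction rho**2.
ratio = errors[1] / errors[0]
```

Running this, `ratio` equals `rho**2 = 0.81`, matching the intuition that stronger coupling between the blocks (larger `|rho|`) slows CAVI down, while for weakly coupled blocks the algorithm converges almost immediately.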


