Gibbs sampler and coordinate ascent variational inference: a set-theoretical review

08/03/2020
by Se Yoon Lee, et al.

A central task in Bayesian machine learning is the approximation of the posterior distribution. The Gibbs sampler and coordinate ascent variational inference (CAVI) are widely used approximation techniques, relying on stochastic and deterministic approximations, respectively. This article clarifies that the two schemes can be explained more generally from a set-theoretical point of view. The alternative views are consequences of a duality formula for variational inference.
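As a minimal illustration of the first of these two schemes, the sketch below runs a Gibbs sampler on a bivariate standard normal with correlation `rho`, alternately drawing each coordinate from its full conditional. The target distribution and the NumPy implementation are assumptions chosen for illustration; they are not taken from the paper itself.

```python
import numpy as np

def gibbs_bivariate_normal(rho, n_samples, seed=0):
    """Gibbs sampler for a bivariate standard normal with correlation rho.

    Each full conditional is univariate normal:
        x | y ~ N(rho * y, 1 - rho^2)
        y | x ~ N(rho * x, 1 - rho^2)
    """
    rng = np.random.default_rng(seed)
    x, y = 0.0, 0.0
    samples = np.empty((n_samples, 2))
    sd = np.sqrt(1.0 - rho ** 2)  # conditional standard deviation
    for i in range(n_samples):
        x = rng.normal(rho * y, sd)  # draw x from its full conditional
        y = rng.normal(rho * x, sd)  # draw y from its full conditional
        samples[i] = (x, y)
    return samples

samples = gibbs_bivariate_normal(rho=0.8, n_samples=20000)
print(samples.mean(axis=0))          # close to [0, 0]
print(np.corrcoef(samples.T)[0, 1])  # close to 0.8
```

The deterministic counterpart, CAVI, would instead update the parameters of a factorized approximation in the same coordinate-wise fashion, replacing each random draw with an expectation under the current approximating factors.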
