Gibbs sampler and coordinate ascent variational inference: a set-theoretical review
A central task in Bayesian machine learning is approximating the posterior distribution. The Gibbs sampler and coordinate ascent variational inference (CAVI) are two widely used approximation schemes, based on stochastic and deterministic updates, respectively. This article shows that both schemes can be understood more generally from a set-theoretical point of view. The alternative views follow from a duality formula for variational inference.
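The abstract does not detail either algorithm, but a minimal sketch of a Gibbs sampler may help fix ideas. The toy target below (a standard bivariate Gaussian with correlation `rho`, chosen here purely for illustration, not taken from the paper) has tractable full conditionals, so the sampler simply alternates draws from p(x | y) and p(y | x):

```python
import math
import random

def gibbs_bivariate_normal(rho, n_iter=5000, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Full conditionals: x | y ~ N(rho * y, 1 - rho^2), and symmetrically
    y | x ~ N(rho * x, 1 - rho^2).
    """
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho ** 2)  # conditional standard deviation
    x, y = 0.0, 0.0
    samples = []
    for _ in range(n_iter):
        x = rng.gauss(rho * y, sd)  # draw x from p(x | y)
        y = rng.gauss(rho * x, sd)  # draw y from p(y | x)
        samples.append((x, y))
    return samples

samples = gibbs_bivariate_normal(0.8)
```

After discarding burn-in, the empirical correlation of the chain approaches `rho`; CAVI applied to the same target would instead fit an independent (mean-field) Gaussian, a standard contrast between the stochastic and deterministic schemes the abstract names.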