Incorporating Expressive Graphical Models in Variational Approximations: Chain-Graphs and Hidden Variables

by Tal El-Hay, et al.

Global variational approximation methods for graphical models allow efficient approximate inference of complex posterior distributions by using a simpler model. The choice of approximating model determines a tradeoff between the complexity of the approximation procedure and the quality of the resulting approximation. In this paper, we consider variational approximations based on two classes of models that are richer than standard Bayesian networks, Markov networks, or mixture models; as such, these classes make better tradeoffs available along the spectrum of approximations. The first class consists of chain graphs, which capture distributions that are partially directed. The second class consists of directed graphs (Bayesian networks) with additional latent variables. Both classes can represent multi-variable dependencies that cannot be easily expressed within a Bayesian network.
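To make the tradeoff concrete, the sketch below shows the simplest point on the spectrum the abstract describes: a fully factored (mean-field) approximation to a correlated joint distribution, fitted by coordinate-ascent updates that minimize KL(q || p). This is a hedged toy illustration, not the paper's algorithm; the toy target distribution and update loop are assumptions for the example. The richer families the paper studies (chain graphs, networks with added latent variables) would replace the independence assumption in `q` with a more expressive structure, at the cost of more complex updates.

```python
import numpy as np

# Hypothetical toy target: a correlated joint p(x1, x2) over two
# binary variables. We approximate it with a factored q(x1) q(x2).
p = np.array([[0.40, 0.10],
              [0.10, 0.40]])
log_p = np.log(p)

# Start from uniform factor marginals.
q1 = np.array([0.5, 0.5])
q2 = np.array([0.5, 0.5])

for _ in range(50):
    # Mean-field update: q1(x1) ∝ exp( E_{q2}[log p(x1, x2)] )
    q1 = np.exp(log_p @ q2)
    q1 /= q1.sum()
    # Symmetric update: q2(x2) ∝ exp( E_{q1}[log p(x1, x2)] )
    q2 = np.exp(q1 @ log_p)
    q2 /= q2.sum()

# Residual KL(q || p) measures the cost of forcing independence:
# it is strictly positive here because p is correlated.
q = np.outer(q1, q2)
kl = np.sum(q * (np.log(q) - log_p))
print(q1, q2, kl)
```

A richer approximating family, such as one that keeps an edge between the two variables, would drive this KL to zero on this toy example; the paper's point is choosing where on that spectrum to sit.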




Similar papers:

Markov Properties for Graphical Models with Cycles and Latent Variables

We investigate probabilistic graphical models that allow for both cycles...

Customised Structural Elicitation

Established methods for structural elicitation typically rely on code mo...

SIReN-VAE: Leveraging Flows and Amortized Inference for Bayesian Networks

Initial work on variational autoencoders assumed independent latent vari...

On the Geometry of Bayesian Graphical Models with Hidden Variables

In this paper we investigate the geometry of the likelihood of the unkno...

Bayesian Diagnostics for Chain Event Graphs

Chain event graphs have been established as a practical Bayesian graphic...

IPF for Discrete Chain Factor Graphs

Iterative Proportional Fitting (IPF), combined with EM, is commonly used...

A Variational Approximation for Bayesian Networks with Discrete and Continuous Latent Variables

We show how to use a variational approximation to the logistic function ...