Mixture Approximations to Bayesian Networks

01/23/2013
by Volker Tresp, et al.

Structure and parameters in a Bayesian network uniquely specify the probability distribution of the modeled domain. The locality of both the structure and the probabilistic information is the great benefit of Bayesian networks: the modeler only has to specify local information. On the other hand, this locality of information might prevent the modeler (and even more so any other person) from obtaining a general overview of the important relationships within the domain. The goal of the work presented in this paper is to provide an "alternative" view of the knowledge encoded in a Bayesian network, which can sometimes be very helpful for gaining insight into the underlying domain. The basic idea is to calculate a mixture approximation to the probability distribution represented by the Bayesian network. The mixture component densities can be thought of as representing typical scenarios implied by the Bayesian model, providing intuition about the basic relationships. As an additional benefit, performing inference in the approximate model is simple and intuitive and can provide further insights. The computational complexity of calculating the mixture approximation depends critically on the measure that defines the distance between the probability distribution represented by the Bayesian network and the approximate distribution. Both the KL-divergence and the backward KL-divergence lead to inefficient algorithms. Incidentally, the latter is used in recent work on mixtures of mean-field solutions, to which the work presented here is closely related. We show, however, that a mean squared error cost function leads to update equations that can be solved using the junction tree algorithm. We conclude that the mean squared error cost function can be used for Bayesian networks in which inference based on the junction tree is tractable. For large networks, however, one may have to rely on mean-field approximations.
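To make the idea concrete, here is a minimal sketch that fits a mixture of fully factored component densities q(x) = sum_k w_k prod_i q_k(x_i) to the joint distribution p(x) of a toy three-variable network by minimizing the squared error sum_x (p(x) - q(x))^2. The example network, the number of components K, and the use of a generic optimizer (scipy's L-BFGS-B with numerical gradients) are assumptions made purely for illustration; the paper instead derives update equations that the junction tree algorithm can solve without enumerating the joint, which this brute-force sketch does not reproduce.

```python
import itertools

import numpy as np
from scipy.optimize import minimize

# Toy network over binary variables with structure A -> B, A -> C
# (assumed purely for illustration).
p_a = np.array([0.3, 0.7])                       # P(A)
p_b_given_a = np.array([[0.9, 0.1],              # P(B | A=0)
                        [0.2, 0.8]])             # P(B | A=1)
p_c_given_a = np.array([[0.6, 0.4],              # P(C | A=0)
                        [0.1, 0.9]])             # P(C | A=1)

states = list(itertools.product([0, 1], repeat=3))
p = np.array([p_a[a] * p_b_given_a[a, b] * p_c_given_a[a, c]
              for a, b, c in states])            # full joint over (A, B, C)

K, n_vars, n_states = 2, 3, 2                    # K mixture components

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def unpack(theta):
    """Map unconstrained parameters to mixture weights and factor tables."""
    w = softmax(theta[:K])                                    # w_k, sums to 1
    tables = softmax(theta[K:].reshape(K, n_vars, n_states))  # q_k(X_i)
    return w, tables

def q_of(theta):
    """Mixture of fully factored components, evaluated on all joint states."""
    w, tables = unpack(theta)
    q = np.zeros(len(states))
    for idx, x in enumerate(states):
        for k in range(K):
            q[idx] += w[k] * np.prod([tables[k, i, x[i]]
                                      for i in range(n_vars)])
    return q

def squared_error(theta):
    return np.sum((p - q_of(theta)) ** 2)

rng = np.random.default_rng(0)
theta0 = rng.normal(size=K + K * n_vars * n_states)
res = minimize(squared_error, theta0, method="L-BFGS-B")

w, tables = unpack(res.x)
print("squared error:", squared_error(res.x))
print("mixture weights:", np.round(w, 3))
# Each component's marginals q_k(A), q_k(B), q_k(C) can be read off `tables`
# and interpreted as one "typical scenario" implied by the network.
```

With K = 2 the fit here comes out essentially exact, since conditioning on A renders B and C independent: each component can represent one scenario, A = 0 or A = 1, which illustrates the scenario interpretation of the mixture components described above. Roughly speaking, the appeal of the squared-error criterion is that the terms in its update equations are expectations of product-form functions under the network distribution, the kind of quantity junction-tree inference computes efficiently; full enumeration as in this sketch is only viable for toy networks.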


