Reconsidering Dependency Networks from an Information Geometry Perspective

07/02/2021
by Kazuya Takabatake, et al.

Dependency networks (Heckerman et al., 2000) are potential probabilistic graphical models for systems comprising a large number of variables. Like Bayesian networks, the structure of a dependency network is represented by a directed graph, and each node has a conditional probability table. Learning and inference are carried out locally on individual nodes, so computation remains tractable even when the number of variables is large. However, the learned distribution of a dependency network is the stationary distribution of a Markov chain called pseudo-Gibbs sampling, and it has no closed-form expression. This technical disadvantage has impeded the development of dependency networks. In this paper, we consider a certain manifold for each node. Pseudo-Gibbs sampling can then be interpreted as iterative m-projections onto these manifolds. This interpretation yields a theoretical bound on where in distribution space the stationary distribution of pseudo-Gibbs sampling can lie. Furthermore, it leads to structure and parameter learning algorithms formulated as optimization problems. Finally, we compare dependency networks and Bayesian networks experimentally. The results demonstrate that the two models achieve roughly the same accuracy in their learned distributions, and that the dependency network can be learned much faster than the Bayesian network.
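The central object in the abstract, pseudo-Gibbs sampling, is straightforward to illustrate: cycle through the nodes and resample each one from its local conditional probability table given the current values of its parents, exactly as in Gibbs sampling, except that the local conditionals need not be consistent with any single joint distribution. Below is a minimal Python sketch under that description; the three-variable structure, the variable names, and the CPT numbers are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Minimal sketch of pseudo-Gibbs sampling for a dependency network over
# binary variables. Structure, names, and CPT values are made up for
# illustration; they are not from the paper.

rng = np.random.default_rng(0)

# Structure: each node lists its parents (a directed graph; unlike a
# Bayesian network, cycles are allowed).
parents = {
    "A": [],
    "B": ["A"],
    "C": ["A", "B"],
}

# Local conditional probability tables: a tuple of parent values maps to
# P(node = 1 | parents). These probabilities are arbitrary examples.
cpt = {
    "A": {(): 0.6},
    "B": {(0,): 0.2, (1,): 0.7},
    "C": {(0, 0): 0.1, (0, 1): 0.5, (1, 0): 0.4, (1, 1): 0.9},
}

def pseudo_gibbs(n_sweeps=10000, burn_in=1000):
    """Cyclically resample each node from its local conditional given the
    current parent values. States collected after burn-in approximate the
    chain's stationary distribution."""
    state = {v: 0 for v in parents}
    samples = []
    for sweep in range(n_sweeps):
        for v in parents:  # fixed visiting order; a random order also works
            key = tuple(state[p] for p in parents[v])
            state[v] = int(rng.random() < cpt[v][key])
        if sweep >= burn_in:
            samples.append(tuple(state[v] for v in sorted(parents)))
    return samples

samples = pseudo_gibbs()
# Empirical estimate of the stationary distribution over (A, B, C).
values, counts = np.unique(np.array(samples), axis=0, return_counts=True)
for v, c in zip(values, counts):
    print(dict(zip(sorted(parents), v)), round(c / len(samples), 4))
```

When the local conditionals happen to be consistent with a joint distribution, this is ordinary Gibbs sampling and the stationary distribution is that joint; when they are not, the chain (with strictly positive CPTs) still has a unique stationary distribution, and it is this distribution that the paper's m-projection interpretation bounds in distribution space.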


Related research

10/19/2012 · Learning Module Networks
Methods for learning Bayesian network structure can discover dependency ...

01/16/2013 · Dependency Networks for Collaborative Filtering and Data Visualization
We describe a graphical model for probabilistic relationships---an alter...

08/17/2019 · Prune Sampling: a MCMC inference technique for discrete and deterministic Bayesian networks
We introduce and characterise the performance of the Markov chain Monte ...

10/16/2012 · Closed-Form Learning of Markov Networks from Dependency Networks
Markov networks (MNs) are a powerful way to compactly represent a joint ...

03/18/2015 · GSNs: Generative Stochastic Networks
We introduce a novel training principle for probabilistic models that is...

05/31/2019 · Component-wise approximate Bayesian computation via Gibbs-like steps
Approximate Bayesian computation methods are useful for generative model...

08/09/2021 · Identification in Bayesian Estimation of the Skewness Matrix in a Multivariate Skew-Elliptical Distribution
Harvey et al. (2010) extended the Bayesian estimation method by Sahu et ...
