
On Testing Whether an Embedded Bayesian Network Represents a Probability Model
Testing the validity of probabilistic models containing unmeasured (hidden) variables is shown to be a hard task. We show that testing whether a model is structurally incompatible with the data at hand requires an exponential number of independence evaluations, each of the form: "X is conditionally independent of Y, given Z." In contrast, a linear number of such evaluations (one per vertex) suffices to test a standard Bayesian network. On the positive side, we show that if a network with hidden variables G has a tree skeleton, checking whether G represents a given probability model P requires only a polynomial number of such independence evaluations. Moreover, we provide an algorithm that efficiently constructs a tree-structured Bayesian network (with hidden variables) representing P if such a network exists, and recognizes when no such network exists.
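The independence evaluations counted above are the basic operations. As a minimal sketch (not the paper's algorithm), a single such evaluation on an explicitly tabulated discrete joint distribution can be carried out by checking the factorization P(x, y | z) = P(x | z) · P(y | z) for every assignment; the `cond_independent` helper and the example distribution below are illustrative assumptions, not from the paper.

```python
# Sketch of one independence evaluation of the form
# "X is conditionally independent of Y, given Z", checked directly
# on a tabulated joint distribution. Illustrative only.
import itertools

def cond_independent(P, tol=1e-12):
    """P maps (x, y, z) -> probability. Returns True iff
    P(x, y, z) * P(z) == P(x, z) * P(y, z) for all assignments,
    which is equivalent to X independent of Y given Z."""
    xs = sorted({k[0] for k in P})
    ys = sorted({k[1] for k in P})
    zs = sorted({k[2] for k in P})
    for z in zs:
        pz = sum(P[(x, y, z)] for x in xs for y in ys)
        for x, y in itertools.product(xs, ys):
            pxz = sum(P[(x, yy, z)] for yy in ys)   # marginal P(x, z)
            pyz = sum(P[(xx, y, z)] for xx in xs)   # marginal P(y, z)
            if abs(P[(x, y, z)] * pz - pxz * pyz) > tol:
                return False
    return True

# Example: X and Y are independent noisy copies of Z, so X ⟂ Y | Z holds.
P_ci = {(x, y, z): 0.5 * (0.9 if x == z else 0.1) * (0.8 if y == z else 0.2)
        for x in (0, 1) for y in (0, 1) for z in (0, 1)}

# Counterexample: Z = X XOR Y, so conditioning on Z couples X and Y.
P_dep = {(x, y, z): 0.25 if z == x ^ y else 0.0
         for x in (0, 1) for y in (0, 1) for z in (0, 1)}
```

A full structural test of an embedded network would issue many such evaluations; the paper's result is about how many are needed, not how each one is computed.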