
Asymptotic Model Selection for Directed Networks with Hidden Variables

by Dan Geiger et al.

We extend the Bayesian Information Criterion (BIC), an asymptotic approximation to the marginal likelihood, to Bayesian networks with hidden variables. This approximation can be used to select models given large samples of data. Both the standard BIC and our extension penalize the complexity of a model according to the dimension of its parameters. We argue that the dimension of a Bayesian network with hidden variables is the rank of the Jacobian matrix of the transformation from the parameters of the network to the parameters of the observable variables. We compute the dimensions of several networks, including the naive Bayes model with a hidden root node.
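The Jacobian-rank notion of dimension can be checked numerically. The sketch below (illustrative only; the `joint` and `jacobian` helpers are assumptions, not code from the paper) builds the smallest naive Bayes model with a hidden root: a binary hidden variable H with two binary children X1 and X2. The network has 5 free parameters, but the rank of the Jacobian of the map from those parameters to the observable joint distribution over (X1, X2) is only 3, which by the paper's argument is the model's effective dimension.

```python
import numpy as np

def joint(theta):
    """Map network parameters to the observable joint P(X1, X2).

    theta = [p(H=1), p(X1=1|H=0), p(X1=1|H=1), p(X2=1|H=0), p(X2=1|H=1)]
    """
    h1, a0, a1, b0, b1 = theta
    ph = np.array([1 - h1, h1])                   # p(H)
    pa = np.array([[1 - a0, a0], [1 - a1, a1]])   # p(X1 | H)
    pb = np.array([[1 - b0, b0], [1 - b1, b1]])   # p(X2 | H)
    # P(x1, x2) = sum_h p(h) p(x1|h) p(x2|h), marginalizing out H
    return np.einsum('h,hi,hj->ij', ph, pa, pb).ravel()

def jacobian(f, theta, eps=1e-6):
    """Forward-difference Jacobian of f at theta."""
    f0 = f(theta)
    J = np.empty((len(f0), len(theta)))
    for k in range(len(theta)):
        t = theta.copy()
        t[k] += eps
        J[:, k] = (f(t) - f0) / eps
    return J

# A generic interior point of the 5-dimensional parameter space.
theta = np.array([0.3, 0.2, 0.7, 0.4, 0.9])
rank = np.linalg.matrix_rank(jacobian(joint, theta), tol=1e-4)
print(rank)  # 3: the effective dimension, not the 5 raw parameters
```

The rank 3 agrees with the observable side: the joint over two binary variables has only 3 free entries, and any mixture of two product distributions on a 2x2 table already fills that space, so the 5 network parameters are redundant.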



