On the Use of Skeletons when Learning in Bayesian Networks

by Harald Steck et al.

In this paper, we present a heuristic operator that aims to simultaneously optimize the orientations of all the edges in an intermediate Bayesian network structure during the search process. This is done by alternating between the space of directed acyclic graphs (DAGs) and the space of skeletons. The resulting edge orientations are determined by a scoring function rather than by induced conditional independences. This operator can be used as an extension to commonly employed search strategies. It is evaluated in experiments with artificial and real-world data.
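The core idea of the abstract — dropping edge directions to obtain a skeleton, then re-orienting all edges at once so as to maximize a decomposable scoring function while keeping the graph acyclic — can be illustrated with a minimal sketch. This is a brute-force toy version with a hypothetical `score(node, parents)` function, not the paper's heuristic (which avoids exhaustive enumeration over all orientations):

```python
from itertools import product

def is_acyclic(nodes, edges):
    # Kahn's algorithm: a directed graph is acyclic iff a full
    # topological ordering of its nodes exists.
    indeg = {n: 0 for n in nodes}
    adj = {n: [] for n in nodes}
    for u, v in edges:
        indeg[v] += 1
        adj[u].append(v)
    stack = [n for n in nodes if indeg[n] == 0]
    seen = 0
    while stack:
        n = stack.pop()
        seen += 1
        for m in adj[n]:
            indeg[m] -= 1
            if indeg[m] == 0:
                stack.append(m)
    return seen == len(nodes)

def best_orientation(nodes, skeleton, score):
    """Orient every undirected edge of `skeleton` in all possible ways,
    keeping the acyclic orientation (DAG) with the highest total score.
    `score(node, parents)` is any decomposable scoring function."""
    best_edges, best_val = None, float("-inf")
    for flips in product([False, True], repeat=len(skeleton)):
        edges = [(v, u) if f else (u, v)
                 for (u, v), f in zip(skeleton, flips)]
        if not is_acyclic(nodes, edges):
            continue  # only DAGs are valid Bayesian network structures
        parents = {n: set() for n in nodes}
        for u, v in edges:
            parents[v].add(u)
        val = sum(score(n, frozenset(parents[n])) for n in nodes)
        if val > best_val:
            best_edges, best_val = edges, val
    return best_edges, best_val
```

For example, with the chain skeleton A–B–C and a toy score that rewards B for having more parents, the search recovers the v-structure A→B←C, which no independence-based orientation rule applied to the skeleton alone would fix without the score.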



