On the Use of Skeletons when Learning in Bayesian Networks

01/16/2013
by Harald Steck, et al.

In this paper, we present a heuristic operator which aims at simultaneously optimizing the orientations of all the edges in an intermediate Bayesian network structure during the search process. This is done by alternating between the space of directed acyclic graphs (DAGs) and the space of skeletons. The resulting edge orientations are based on a scoring function rather than on induced conditional independences. This operator can be used as an extension to commonly employed search strategies. It is evaluated in experiments with artificial and real-world data.
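The abstract describes alternating between DAG space and skeleton space, re-orienting the skeleton's edges so as to improve a score rather than testing conditional independences. The following Python sketch illustrates that idea on a toy scale, assuming a generic decomposable score: the exhaustive ordering-based orientation search and the toy_score function are illustrative assumptions, not the paper's actual operator or scoring function.

```python
# Minimal sketch of score-based re-orientation of a skeleton:
# drop all edge directions of an intermediate DAG, then search for an
# acyclic orientation of the resulting skeleton that maximizes a score.

from itertools import permutations


def skeleton(dag_edges):
    """Return the undirected skeleton of a DAG given as a set of (u, v) edges."""
    return {frozenset(e) for e in dag_edges}


def orient(skel, order):
    """Orient every skeleton edge from the earlier to the later node in `order`.
    Any orientation consistent with a total order is guaranteed acyclic."""
    pos = {v: i for i, v in enumerate(order)}
    return {tuple(sorted(e, key=pos.get)) for e in skel}


def best_reorientation(dag_edges, nodes, score):
    """Exhaustively search node orderings (feasible only for small graphs)
    and return the acyclic orientation of the skeleton with the highest score."""
    skel = skeleton(dag_edges)
    best_dag, best_val = None, float("-inf")
    for order in permutations(nodes):
        candidate = orient(skel, order)
        val = score(candidate)
        if val > best_val:
            best_dag, best_val = candidate, val
    return best_dag, best_val


if __name__ == "__main__":
    # Toy stand-in for a data-driven scoring function (e.g. BDe or BIC):
    # here we simply reward orientations that contain the edge ("A", "B").
    def toy_score(dag):
        return sum(1.0 for (u, v) in dag if (u, v) == ("A", "B"))

    current_dag = {("B", "A"), ("B", "C")}
    new_dag, value = best_reorientation(current_dag, ["A", "B", "C"], toy_score)
    print(new_dag, value)
```

In practice the paper's operator would plug this re-orientation step into a standard score-based search over DAGs, replacing the exhaustive ordering search with a heuristic one.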

Related research

02/13/2013 · Learning Equivalence Classes of Bayesian Network Structures
Approaches to learning Bayesian networks from data typically combine a s...

09/26/2013 · SparsityBoost: A New Scoring Function for Learning Bayesian Network Structure
We give a new consistent scoring function for structure learning of Baye...

03/21/2018 · Efficient Structure Learning and Sampling of Bayesian Networks
Bayesian networks are probabilistic graphical models widely employed to ...

02/27/2013 · Learning Bayesian Networks: The Combination of Knowledge and Statistical Data
We describe algorithms for learning Bayesian networks from a combination...

06/27/2012 · Smoothness and Structure Learning by Proxy
As data sets grow in size, the ability of learning methods to find struc...

10/01/2016 · A Birth and Death Process for Bayesian Network Structure Inference
Bayesian networks (BNs) are graphical models that are useful for represe...

10/16/2012 · An Improved Admissible Heuristic for Learning Optimal Bayesian Networks
Recently two search algorithms, A* and breadth-first branch and bound (B...