Incorporating Type II Error Probabilities from Independence Tests into Score-Based Learning of Bayesian Network Structure

05/12/2015
by Eliot Brenner, et al.

We give a new consistent scoring function for structure learning of Bayesian networks. In contrast to traditional approaches to score-based structure learning, such as BDeu or MDL, the complexity penalty that we propose is data-dependent and is given by the probability that a conditional independence test correctly shows that an edge cannot exist. What really distinguishes this new scoring function from earlier work is that it becomes computationally easier to maximize as the amount of data increases. We prove a polynomial sample complexity result, showing that maximizing this score is guaranteed to correctly learn a structure with no false edges and a distribution close to the generating distribution, whenever there exists a Bayesian network which is a perfect map for the data-generating distribution. Although the new score can be used with any search algorithm, in our related UAI 2013 paper [BS13] we presented empirical results showing that it is particularly effective when used together with a linear programming relaxation approach to Bayesian network structure learning. The present paper contains all details of the proofs of the finite-sample complexity results in [BS13], as well as a detailed explanation of the computation of certain error probabilities, called beta-values, whose precomputation and tabulation are necessary for the implementation of the algorithm in [BS13].
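To make the idea concrete, the following is a minimal Python sketch of how a data-dependent score of this general shape might be assembled: a likelihood term plus a reward for each pair of variables left unconnected, where the reward grows as an independence test more confidently rules the edge out. The helper names (empirical_mutual_information, sparsity_bonus, data_dependent_score) and the crude surrogate used in place of the paper's tabulated beta-values are assumptions made for exposition only; see [BS13] and the present paper for the actual definitions.

```python
import numpy as np

def empirical_mutual_information(x, y):
    """Plug-in estimate of I(X;Y) from two discrete data columns."""
    xs, ys = np.unique(x), np.unique(y)
    joint = np.array([[np.mean((x == a) & (y == b)) for b in ys] for a in xs])
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    mask = joint > 0
    return float(np.sum(joint[mask] * np.log(joint[mask] / (px @ py)[mask])))

def sparsity_bonus(x, y, n):
    """Data-dependent reward for omitting the edge between X and Y.

    Crude stand-in for a -log(beta) term, where beta is the Type II error
    probability of an independence test: when the empirical mutual
    information is small relative to 1/n, the test supports independence
    and omitting the edge is rewarded more strongly.  This surrogate is an
    illustrative assumption, not the paper's beta-value computation.
    """
    return max(0.0, np.log(n) - n * empirical_mutual_information(x, y))

def data_dependent_score(data, parent_sets, log_likelihood):
    """Fit term plus bonuses summed over pairs left unconnected in the graph."""
    n, d = data.shape
    score = log_likelihood
    for i in range(d):
        for j in range(i + 1, d):
            if j not in parent_sets[i] and i not in parent_sets[j]:
                score += sparsity_bonus(data[:, i], data[:, j], n)
    return score

# Example: three binary variables, candidate graph with a single edge 0 -> 1.
rng = np.random.default_rng(0)
data = rng.integers(0, 2, size=(500, 3))
parent_sets = [set(), {0}, set()]   # X1 has parent X0; X2 is isolated
print(data_dependent_score(data, parent_sets, log_likelihood=-1000.0))
```

Because the bonus for each absent edge is computed from the data itself, more samples sharpen the independence tests and make sparse structures stand out more clearly, which is the intuition behind the score becoming easier to maximize as the amount of data grows.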
