Structural Learning of Simple Staged Trees

by Manuele Leonelli et al.

Bayesian networks faithfully represent the symmetric conditional independences existing between the components of a random vector. Staged trees are an extension of Bayesian networks for categorical random vectors whose graph represents non-symmetric conditional independences via vertex coloring. However, since they are based on a tree representation of the sample space, the underlying graph becomes cluttered and difficult to visualize as the number of variables increases. Here we introduce the first structural learning algorithms for the class of simple staged trees, entertaining a compact coalescence of the underlying tree from which non-symmetric independences can be easily read. We show that data-learned simple staged trees often outperform Bayesian networks in model fit and illustrate how the coalesced graph is used to identify non-symmetric conditional independences.
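The staged tree structure described above can be sketched in a few lines of code: the vertices of an event tree are the partial outcome histories, and a stage (colour) groups vertices that share the same conditional distribution for the next variable, so that identical colours encode non-symmetric (context-specific) conditional independences. The staging and parameter values below are purely hypothetical and chosen for illustration; this is not the paper's learning algorithm.

```python
# Illustrative sketch (hypothetical example, not the paper's method):
# a tiny staged tree over three binary variables X1, X2, X3.
from itertools import product

# Stage assignment: history tuple -> stage label (the vertex "colour").
# The vertices (0, 0) and (0, 1) share stage "s3a", encoding the
# context-specific independence X3 _||_ X2 | X1 = 0.
stages = {
    (): "s1",
    (0,): "s2a", (1,): "s2b",
    (0, 0): "s3a", (0, 1): "s3a",   # same colour -> same distribution
    (1, 0): "s3b", (1, 1): "s3c",
}

# One parameter per stage: P(next variable = 1 | stage). Values are arbitrary.
theta = {"s1": 0.5, "s2a": 0.3, "s2b": 0.7,
         "s3a": 0.2, "s3b": 0.6, "s3c": 0.9}

def prob(outcome):
    """Joint probability of a full outcome (x1, x2, x3) under the staged tree:
    the product of the stage-specific conditional probabilities along the
    root-to-leaf path."""
    p = 1.0
    for i, x in enumerate(outcome):
        t = theta[stages[outcome[:i]]]
        p *= t if x == 1 else 1.0 - t
    return p

# The leaf probabilities form a valid joint distribution (they sum to one).
total = sum(prob(o) for o in product((0, 1), repeat=3))
```

Because the two coloured vertices share one parameter, the conditional distribution of X3 is the same in the contexts (X1=0, X2=0) and (X1=0, X2=1), which is exactly the kind of non-symmetric independence a Bayesian network cannot express graphically.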



Staged trees and asymmetry-labeled DAGs

Bayesian networks are a widely-used class of probabilistic graphical mod...

Highly Efficient Structural Learning of Sparse Staged Trees

Several structural learning algorithms for staged tree models, an asymme...

Generalizing Tree Probability Estimation via Bayesian Networks

Probability estimation is one of the fundamental tasks in statistics and...

Equations defining probability tree models

Coloured probability tree models are statistical models coding condition...

Interpolating Conditional Density Trees

Joint distributions over many variables are frequently modeled by decomp...

Lazy Evaluation of Symmetric Bayesian Decision Problems

Solving symmetric Bayesian decision problems is a computationally intens...

Estimating Well-Performing Bayesian Networks using Bernoulli Mixtures

A novel method for estimating Bayesian network (BN) parameters from data...