
Decomposable context-specific models

by Yulia Alexandr, et al.

We introduce a family of discrete context-specific models, which we call decomposable. We construct this family from the subclass of staged tree models known as CStree models. We give an algebraic and combinatorial characterization of all context-specific independence relations that hold in a decomposable context-specific model, which yields a Markov basis. We prove that the moralization operation applied to the graphical representation of a context-specific model does not affect the implied independence relations, thus affirming that these models are algebraically described by a finite collection of decomposable graphical models. More generally, we establish that several algebraic, combinatorial, and geometric properties of decomposable context-specific models generalize those of decomposable graphical models to the context-specific setting.
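To make the central notion concrete, here is a minimal, hypothetical illustration of a context-specific independence relation of the kind these models encode: a toy joint distribution over binary variables X, Y, Z in which X and Y are independent in the context Z = 0 but dependent in the context Z = 1. The distribution and the helper `csi_holds` are illustrative choices, not constructions from the paper.

```python
import numpy as np

# Toy joint distribution P(X, Y, Z) over three binary variables, chosen
# (hypothetically) so that X _||_ Y holds in the context Z = 0 but fails
# in the context Z = 1. Axes: P[x, y, z].
P = np.zeros((2, 2, 2))
P[:, :, 0] = 0.5 * np.array([[0.25, 0.25], [0.25, 0.25]])  # X, Y independent given Z = 0
P[:, :, 1] = 0.5 * np.array([[0.40, 0.10], [0.10, 0.40]])  # X, Y dependent given Z = 1

def csi_holds(P, z, tol=1e-12):
    """Check the context-specific independence X _||_ Y | Z = z by
    comparing P(X, Y | Z = z) with the product of its marginals."""
    cond = P[:, :, z] / P[:, :, z].sum()                   # P(X, Y | Z = z)
    outer = np.outer(cond.sum(axis=1), cond.sum(axis=0))   # P(X | Z=z) P(Y | Z=z)
    return np.allclose(cond, outer, atol=tol)

print(csi_holds(P, 0))  # True: independence holds in context Z = 0
print(csi_holds(P, 1))  # False: it fails in context Z = 1
```

A decomposable context-specific model constrains which collections of such statements can hold simultaneously; the numerical check above only verifies one statement in one fixed context.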




A new characterization of discrete decomposable models

Decomposable graphical models, also known as perfect DAG models, play a ...

YGGDRASIL - A Statistical Package for Learning Split Models

There are two main objectives of this paper. The first is to present a s...

A non-graphical representation of conditional independence via the neighbourhood lattice

We introduce and study the neighbourhood lattice decomposition of a dist...

Graphical Models with Attention for Context-Specific Independence and an Application to Perceptual Grouping

Discrete undirected graphical models, also known as Markov Random Fields...

Context-specific independencies for ordinal variables in chain regression models

In this work we deal with categorical (ordinal) variables and we focus...

Representation and Learning of Context-Specific Causal Models with Observational and Interventional Data

We consider the problem of representation and learning of causal models ...

The Geometry of Gaussoids

A gaussoid is a combinatorial structure that encodes independence in pro...