
Nested Covariance Determinants and Restricted Trek Separation in Gaussian Graphical Models

by Mathias Drton et al., University of Washington

Directed graphical models specify noisy functional relationships among a collection of random variables. In the Gaussian case, each such model corresponds to a semi-algebraic set of positive definite covariance matrices. The set is given via parametrization, and much work has gone into obtaining an implicit description in terms of polynomial (in-)equalities. Implicit descriptions shed light on problems such as parameter identification, model equivalence, and constraint-based statistical inference. For models given by directed acyclic graphs, which represent settings where all relevant variables are observed, there is a complete theory: All conditional independence relations can be found via graphical d-separation and are sufficient for an implicit description. The situation is far more complicated, however, when some of the variables are hidden (or in other words, unobserved or latent). We consider models associated to mixed graphs that capture the effects of hidden variables through correlated error terms. The notion of trek separation explains when the covariance matrix in such a model has submatrices of low rank and generalizes d-separation. However, in many cases, such as the infamous Verma graph, the polynomials defining the graphical model are not determinantal, and hence cannot be explained by d-separation or trek-separation. In this paper, we show that these constraints often correspond to the vanishing of nested determinants and can be graphically explained by a notion of restricted trek separation.
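As background for the rank statements above, the classic one-factor model gives the simplest case of trek separation: when a single latent variable mediates all treks between two sets of observed variables, the corresponding covariance submatrix has rank one, so its 2x2 minors (tetrads) vanish. The sketch below illustrates this with assumed edge weights and noise variances; the model, parameter values, and variable names are illustrative, not taken from the paper.

```python
import numpy as np

# Hypothetical one-factor model: a latent variable L with unit variance
# drives four observed variables via X_i = lam[i] * L + eps_i.
# Then Cov(X_i, X_j) = lam[i] * lam[j] for i != j, so any 2x2 submatrix
# of the covariance that avoids the diagonal has rank 1.
lam = np.array([0.7, -1.2, 0.5, 2.0])      # assumed edge weights
noise_var = np.array([1.0, 0.5, 2.0, 1.5])  # assumed error variances

Sigma = np.outer(lam, lam) + np.diag(noise_var)

# Tetrad constraint explained by trek separation: every trek between
# {X1, X2} and {X3, X4} passes through the single latent node, so the
# determinant of the corresponding submatrix vanishes identically.
tetrad = np.linalg.det(Sigma[np.ix_([0, 1], [2, 3])])
print(abs(tetrad) < 1e-12)
```

The constraint holds for every choice of the parameters, which is why such determinants belong to the implicit description of the model; the paper's point is that graphs like the Verma graph require nested determinants beyond these ordinary minors.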



