A Sound and Complete Algorithm for Learning Causal Models from Relational Data

09/26/2013
by Marc Maier, et al.

The PC algorithm learns maximally oriented causal Bayesian networks. However, there is no equivalent complete algorithm for learning the structure of relational models, a more expressive generalization of Bayesian networks. Recent developments in the theory and representation of relational models support lifted reasoning about conditional independence. This enables a powerful constraint for orienting bivariate dependencies and forms the basis of a new structure learning algorithm. We present the relational causal discovery (RCD) algorithm, which learns causal relational models. We prove that RCD is sound and complete, and we present empirical results that demonstrate its effectiveness.
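RCD belongs to the family of constraint-based learners descended from PC: it starts from a fully connected structure and prunes edges by conditional independence tests before orienting what remains. The sketch below shows only the generic, propositional skeleton-discovery phase of a PC-style learner, not RCD itself (RCD operates over relational paths and adds orientation rules beyond this). The `indep` oracle is a hypothetical stand-in for a statistical conditional independence test.

```python
from itertools import combinations

def pc_skeleton(variables, indep):
    """Phase I of a PC-style constraint-based learner: begin with the
    complete undirected graph and remove the edge x-y whenever some
    subset z of x's remaining neighbours renders x and y conditionally
    independent. `indep(x, y, z)` is an oracle returning True iff
    x is independent of y given the conditioning set z."""
    adj = {v: set(variables) - {v} for v in variables}
    sepset = {}  # records the separating set for each removed edge
    depth = 0
    # grow the conditioning-set size until no node has enough neighbours
    while any(len(adj[x] - {y}) >= depth for x in variables for y in adj[x]):
        for x in variables:
            for y in list(adj[x]):
                # test subsets of x's other neighbours of the current size
                for z in combinations(sorted(adj[x] - {y}), depth):
                    if indep(x, y, z):
                        adj[x].discard(y)
                        adj[y].discard(x)
                        sepset[frozenset((x, y))] = set(z)
                        break
        depth += 1
    return adj, sepset
```

For example, with a toy oracle encoding the chain A → B → C (so A and C are independent given B), the skeleton keeps edges A-B and B-C and removes A-C with separating set {B}. RCD's contribution, per the abstract, is lifting this style of reasoning to relational models, where the "variables" are relational paths and where a new constraint allows even bivariate dependencies to be oriented.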
