A Sound and Complete Algorithm for Learning Causal Models from Relational Data

09/26/2013
by Marc Maier, et al.

The PC algorithm learns maximally oriented causal Bayesian networks. However, there is no equivalent complete algorithm for learning the structure of relational models, a more expressive generalization of Bayesian networks. Recent developments in the theory and representation of relational models support lifted reasoning about conditional independence. This enables a powerful constraint for orienting bivariate dependencies and forms the basis of a new algorithm for learning structure. We present the relational causal discovery (RCD) algorithm that learns causal relational models. We prove that RCD is sound and complete, and we present empirical results that demonstrate its effectiveness.
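For context, the PC algorithm the abstract builds on is a constraint-based method: it starts from a fully connected undirected graph and removes edges between variables that a conditional-independence test declares independent given some subset of their neighbors. The sketch below shows only this skeleton phase for the propositional (non-relational) case, using a hand-coded independence oracle for a toy chain A → B → C; the function names and the oracle are illustrative assumptions, not the paper's relational CI tests or the RCD algorithm itself.

```python
from itertools import combinations

def pc_skeleton(variables, indep):
    """Skeleton phase of the (propositional) PC algorithm.

    `indep(x, y, S)` is a conditional-independence oracle: True iff
    x is independent of y given the set of variables S. Returns the
    undirected skeleton (set of frozenset edges) and the separating
    sets found for each removed edge.
    """
    adj = {v: set(variables) - {v} for v in variables}
    sepset = {}
    depth = 0  # size of the conditioning sets tested this round
    # Keep growing the conditioning-set size while some node still has
    # enough remaining neighbors to condition on.
    while any(len(adj[x] - {y}) >= depth for x in variables for y in adj[x]):
        for x in variables:
            for y in list(adj[x]):
                neighbors = adj[x] - {y}
                if len(neighbors) < depth:
                    continue
                for S in combinations(sorted(neighbors), depth):
                    if indep(x, y, set(S)):
                        # x and y are independent given S: drop the edge
                        adj[x].discard(y)
                        adj[y].discard(x)
                        sepset[frozenset((x, y))] = set(S)
                        break
        depth += 1
    edges = {frozenset((x, y)) for x in adj for y in adj[x]}
    return edges, sepset

def chain_oracle(x, y, S):
    # d-separations implied by the chain A -> B -> C:
    # the only one is A independent of C given {B}.
    return frozenset((x, y)) == frozenset(("A", "C")) and "B" in S

edges, seps = pc_skeleton(["A", "B", "C"], chain_oracle)
# edges: {{A,B}, {B,C}}; the A–C edge is removed with sepset {B}
```

In practice the oracle is replaced by a statistical test on data (e.g. partial correlation or G-test), and a subsequent orientation phase turns the skeleton into a maximally oriented pattern. RCD generalizes both phases to relational data, where the lifted representation makes additional bivariate orientations possible.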

