
Learning Functional Causal Models with Generative Neural Networks
We introduce a new approach to functional causal modeling from observational data. The approach, called Causal Generative Neural Networks (CGNN), leverages the power of neural networks to learn a generative model of the joint distribution of the observed variables, by minimizing the Maximum Mean Discrepancy (MMD) between generated and observed data. An approximate learning criterion is proposed to scale the computational cost of the approach to linear complexity in the number of observations. The performance of CGNN is studied through three experiments. First, we apply CGNN to the problem of cause-effect inference, where two CGNNs modeling P(Y|X, noise) and P(X|Y, noise) identify the best causal hypothesis out of X → Y and Y → X. Second, CGNN is applied to the problem of identifying v-structures and conditional independencies. Third, we apply CGNN to the problem of multivariate functional causal modeling: given a skeleton describing the dependencies in a set of random variables {X_1, ..., X_d}, CGNN orients the edges in the skeleton to uncover the directed acyclic causal graph describing the causal structure of the random variables. On all three tasks, CGNN is extensively assessed on both artificial and real-world data, comparing favorably to the state of the art. Finally, we extend CGNN to handle the case of confounders, where latent variables are involved in the overall causal model.
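The training criterion at the heart of CGNN is the Maximum Mean Discrepancy between samples drawn from the generative model and the observed data. As a minimal NumPy sketch (not the authors' implementation), the biased MMD² estimator with a Gaussian kernel can be written as follows; the bandwidth value and sample shapes here are illustrative assumptions:

```python
import numpy as np

def gaussian_kernel(a, b, bandwidth=1.0):
    # Pairwise Gaussian kernel values between rows of a and b.
    # bandwidth is an assumed hyperparameter; CGNN-style methods
    # typically use a sum of kernels over several bandwidths.
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def mmd2(x, y, bandwidth=1.0):
    """Biased estimator of squared Maximum Mean Discrepancy:
    E[k(x, x')] + E[k(y, y')] - 2 E[k(x, y)]."""
    kxx = gaussian_kernel(x, x, bandwidth).mean()
    kyy = gaussian_kernel(y, y, bandwidth).mean()
    kxy = gaussian_kernel(x, y, bandwidth).mean()
    return kxx + kyy - 2.0 * kxy
```

In a CGNN-style setup, `x` would hold observed samples of the joint distribution and `y` samples produced by the candidate generative network; the network minimizing this quantity for a given causal direction (e.g. X → Y vs. Y → X) indicates the preferred hypothesis. The quadratic cost of this naive estimator in the number of samples is what the paper's approximate learning criterion reduces to linear complexity.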