Gradient-based Causal Structure Learning with Normalizing Flow

10/07/2020
by Xiongren Chen, et al.

In this paper, we propose a score-based normalizing flow method called DAG-NF to learn the dependency structure of input observational data. Inspired by Grad-CAM in computer vision, we use the Jacobian matrix of the outputs with respect to the inputs to represent causal relationships. This approach generalizes to any neural network, and in particular to flow-based generative models such as Masked Autoregressive Flow (MAF) and Continuous Normalizing Flow (CNF), which are trained with a log-likelihood loss measuring the divergence between the distribution of the input data and the target distribution. The method extends NOTEARS, which enforces an important acyclicity constraint on a continuous adjacency matrix over the graph nodes and significantly reduces the computational complexity of searching the space of graphs.
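
As a rough illustration of the two ideas named in the abstract (and not the paper's actual implementation), the PyTorch sketch below estimates a weighted adjacency matrix from batch-averaged input-output Jacobians and evaluates the NOTEARS acyclicity penalty h(A) = tr(exp(A ∘ A)) − d, which is zero exactly when A encodes a DAG. The function names, the toy MLP, and the aggregation by mean absolute gradient are illustrative assumptions; a real DAG-NF setup would use a normalizing flow such as MAF or CNF as the model.

    import torch

    def jacobian_adjacency(model, x):
        # Estimate edge strengths as batch-averaged |d y_j / d x_i|
        # (an illustrative choice, not the paper's exact procedure).
        x = x.clone().requires_grad_(True)
        y = model(x)                     # assumed output shape: (batch, d)
        d = y.shape[1]
        A = torch.zeros(d, d)
        for j in range(d):
            # Gradient of the j-th output (summed over the batch) w.r.t. all inputs.
            grad_j = torch.autograd.grad(y[:, j].sum(), x, retain_graph=True)[0]
            A[:, j] = grad_j.abs().mean(dim=0)   # column j: influences on output j
        return A

    def notears_acyclicity(A):
        # NOTEARS penalty h(A) = tr(exp(A * A)) - d; zero iff the weighted graph is acyclic.
        d = A.shape[0]
        return torch.trace(torch.matrix_exp(A * A)) - d

    # Toy usage with a stand-in network.
    d = 5
    model = torch.nn.Sequential(torch.nn.Linear(d, d), torch.nn.Tanh(), torch.nn.Linear(d, d))
    x = torch.randn(128, d)
    A = jacobian_adjacency(model, x)
    penalty = notears_acyclicity(A)      # would be added to the training loss, e.g. via an augmented Lagrangian

In practice the acyclicity penalty is driven to zero during training so that the learned adjacency matrix corresponds to a directed acyclic graph.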
