Modeling Attention Flow on Graphs

11/01/2018
by Xiaoran Xu, et al.

Real-world scenarios demand reasoning about the process, not merely prediction of the final outcome, in order to discover latent causal chains and better understand complex systems. This requires learning algorithms that offer both accurate predictions and clear interpretations. We design a set of trajectory reasoning tasks on graphs in which only the source and the destination are observed. We present the attention flow mechanism to explicitly model the reasoning process, leveraging relational inductive biases by basing our models on graph networks. We study how attention flow can effectively act on the underlying information flow implemented by message passing. Experiments demonstrate that attention flow, driven by and interacting with graph networks, provides higher prediction accuracy and better interpretability for trajectory reasoning.
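The abstract describes two coupled processes: an underlying information flow implemented by message passing over a graph network, and an attention flow that rides on top of it and explicitly traces a reasoning trajectory from the source toward the destination. The Python/NumPy sketch below is only a rough illustration of that idea, not the paper's model; the toy graph, the tanh message update, the edge-scoring vector w_att, and the step count are all assumptions made for the example.

# Minimal, illustrative sketch (not the authors' implementation) of attention
# flow acting on message passing: attention mass starts at a source node and is
# redistributed along edges whose scores are computed from the current node
# states, while a separate message-passing step updates those states.
# All shapes, update rules, and parameter names below are assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_nodes, dim, n_steps = 6, 8, 3
# Toy directed graph with source node 0 and destination node 5 (assumed).
adj = np.array([[0, 1, 1, 0, 0, 0],
                [0, 0, 0, 1, 0, 0],
                [0, 0, 0, 1, 1, 0],
                [0, 0, 0, 0, 0, 1],
                [0, 0, 0, 0, 0, 1],
                [0, 0, 0, 0, 0, 0]], dtype=float)

h = rng.normal(size=(n_nodes, dim))        # node states (information flow)
W_msg = 0.1 * rng.normal(size=(dim, dim))  # message transform (assumed)
w_att = 0.1 * rng.normal(size=(2 * dim,))  # edge-scoring vector (assumed)

attention = np.zeros(n_nodes)
attention[0] = 1.0                         # all attention starts at the source

for _ in range(n_steps):
    # Information flow: one message-passing step (sum over in-neighbors).
    messages = adj.T @ (h @ W_msg)
    h = np.tanh(h + messages)

    # Attention flow: score each existing edge (i, j) from its endpoint states.
    scores = np.full((n_nodes, n_nodes), -np.inf)
    for i, j in zip(*np.nonzero(adj)):
        scores[i, j] = w_att @ np.concatenate([h[i], h[j]])

    # Softmax over outgoing edges gives a transition matrix for attention mass.
    trans = np.where(adj > 0, np.exp(scores - scores[adj > 0].max()), 0.0)
    row_sum = trans.sum(axis=1, keepdims=True)
    has_out = row_sum.squeeze() > 0
    trans[has_out] /= row_sum[has_out]
    trans[~has_out] = np.eye(n_nodes)[~has_out]   # sinks keep their attention

    attention = trans.T @ attention               # flow attention one step forward

print("attention over nodes after", n_steps, "steps:", np.round(attention, 3))

Running this shows the unit of attention mass that starts on the source node spreading along edges and concentrating on reachable nodes; in the paper's setting such a transition structure would be learned so that the attention trajectory both predicts the destination and exposes the intermediate path.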
