Gated Graph Sequence Neural Networks

11/17/2015
by Yujia Li, et al.

Graph-structured data appears frequently in domains including chemistry, natural language semantics, social networks, and knowledge bases. In this work, we study feature learning techniques for graph-structured inputs. Our starting point is previous work on Graph Neural Networks (Scarselli et al., 2009), which we modify to use gated recurrent units and modern optimization techniques and then extend to output sequences. The result is a flexible and broadly useful class of neural network models that has favorable inductive biases relative to purely sequence-based models (e.g., LSTMs) when the problem is graph-structured. We demonstrate the capabilities on some simple AI (bAbI) and graph algorithm learning tasks. We then show it achieves state-of-the-art performance on a problem from program verification, in which subgraphs need to be matched to abstract data structures.
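The central modification described above, replacing the original GNN's propagation with gated recurrent units unrolled for a fixed number of steps, is easy to sketch. Below is a minimal, illustrative NumPy sketch of one GGNN propagation round, not the authors' implementation: the toy graph, state dimension d, and weight names (W_z, U_z, ...) are assumptions, and the paper's per-edge-type message weights are collapsed into a single adjacency matrix for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy directed graph on 4 nodes: A[i, j] = 1 means an edge from node j into node i.
# (The paper uses separate propagation matrices per edge type; one matrix here for brevity.)
A = np.array([[0, 1, 0, 0],
              [0, 0, 1, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)

d = 8                                   # node state dimension (illustrative choice)
H = rng.standard_normal((4, d))         # initial node states h_v

# Randomly initialized parameters; these are learned in practice.
W_z, U_z = rng.standard_normal((d, d)), rng.standard_normal((d, d))
W_r, U_r = rng.standard_normal((d, d)), rng.standard_normal((d, d))
W_h, U_h = rng.standard_normal((d, d)), rng.standard_normal((d, d))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ggnn_step(H, A):
    """One propagation round: aggregate incoming neighbor states,
    then update every node with a GRU-style gated cell."""
    M = A @ H                                    # messages: sum of incoming neighbor states
    Z = sigmoid(M @ W_z + H @ U_z)               # update gate
    R = sigmoid(M @ W_r + H @ U_r)               # reset gate
    H_cand = np.tanh(M @ W_h + (R * H) @ U_h)    # candidate states
    return (1 - Z) * H + Z * H_cand              # gated interpolation

# Unlike the original GNN, propagation runs for a fixed number of steps
# rather than to a contraction-map fixed point.
for _ in range(5):
    H = ggnn_step(H, A)

print(H.shape)  # (4, 8): per-node representations after propagation
```

The sequence-output extension (the "GGS-NN" of the title) would then stack several such propagation phases, with each phase producing one output step from the resulting node states.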

Related research

Residual Gated Graph ConvNets (11/20/2017)
Graph-structured data such as functional brain networks, social networks...

Inductive Graph Representation Learning with Recurrent Graph Neural Networks (04/17/2019)
In this paper, we study the problem of node representation learning with...

Modelling Identity Rules with Neural Networks (12/06/2018)
In this paper, we show that standard feed-forward and recurrent neural n...

Structured Neural Summarization (11/05/2018)
Summarization of long sequences into a concise statement is a core probl...

Graph-to-Sequence Learning using Gated Graph Neural Networks (06/26/2018)
Many NLP applications can be framed as a graph-to-sequence learning prob...

Structured Sequence Modeling with Graph Convolutional Recurrent Networks (12/22/2016)
This paper introduces Graph Convolutional Recurrent Network (GCRN), a de...

Perceiver IO: A General Architecture for Structured Inputs & Outputs (07/30/2021)
The recently-proposed Perceiver model obtains good results on several do...