
Residual Gated Graph ConvNets
Graph-structured data such as functional brain networks, social networks...

Inductive Graph Representation Learning with Recurrent Graph Neural Networks
In this paper, we study the problem of node representation learning with...

Modelling Identity Rules with Neural Networks
In this paper, we show that standard feedforward and recurrent neural n...

Structured Neural Summarization
Summarization of long sequences into a concise statement is a core probl...

Graph-to-Sequence Learning using Gated Graph Neural Networks
Many NLP applications can be framed as a graph-to-sequence learning prob...

Structured Sequence Modeling with Graph Convolutional Recurrent Networks
This paper introduces Graph Convolutional Recurrent Network (GCRN), a de...

Perceiver IO: A General Architecture for Structured Inputs Outputs
The recently-proposed Perceiver model obtains good results on several do...
Gated Graph Sequence Neural Networks
Graph-structured data appears frequently in domains including chemistry, natural language semantics, social networks, and knowledge bases. In this work, we study feature learning techniques for graph-structured inputs. Our starting point is previous work on Graph Neural Networks (Scarselli et al., 2009), which we modify to use gated recurrent units and modern optimization techniques and then extend to output sequences. The result is a flexible and broadly useful class of neural network models that has favorable inductive biases relative to purely sequence-based models (e.g., LSTMs) when the problem is graph-structured. We demonstrate the capabilities on some simple AI (bAbI) and graph algorithm learning tasks. We then show it achieves state-of-the-art performance on a problem from program verification, in which subgraphs need to be matched to abstract data structures.
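The gated propagation described in the abstract can be sketched as follows: each node aggregates messages from its neighbors, then updates its hidden state with a GRU-style gated combination. This is a minimal illustrative sketch, not the paper's exact parameterization (which handles multiple edge types and directions with separate weights); the weight names and shapes here are assumptions for the example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ggnn_step(H, A, params):
    """One gated propagation step.

    H: (n, d) node states; A: (n, n) adjacency matrix.
    params: seven (d, d) weight matrices (illustrative, not the
    paper's multi-edge-type parameterization).
    """
    Wm, Wz, Uz, Wr, Ur, Wh, Uh = params
    m = A @ (H @ Wm)                        # aggregate neighbor messages
    z = sigmoid(m @ Wz + H @ Uz)            # update gate
    r = sigmoid(m @ Wr + H @ Ur)            # reset gate
    h_cand = np.tanh(m @ Wh + (r * H) @ Uh) # candidate state
    return (1.0 - z) * H + z * h_cand       # gated state update

# toy example: a 3-node path graph a--b--c, state size 4
rng = np.random.default_rng(0)
n, d = 3, 4
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
params = tuple(rng.normal(scale=0.1, size=(d, d)) for _ in range(7))
H = rng.normal(size=(n, d))
for _ in range(5):                          # unroll a few propagation steps
    H = ggnn_step(H, A, params)
print(H.shape)
```

After a few unrolled steps, each node's state reflects information from nodes several hops away; the gates let the network decide how much of the incoming message to absorb at each step.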