Gated Graph Recurrent Neural Networks

02/03/2020
by Luana Ruiz, et al.

Graph processes exhibit a temporal structure determined by the sequence index and a spatial structure determined by the graph support. To learn from graph processes, an information processing architecture must therefore be able to exploit both underlying structures. We introduce Graph Recurrent Neural Networks (GRNNs), which achieve this goal by combining the hidden Markov model (HMM) with graph signal processing (GSP). In the GRNN, the number of learnable parameters is independent of both the length of the sequence and the size of the graph, guaranteeing scalability. We also prove that GRNNs are permutation equivariant and stable to perturbations of the underlying graph support. Following the observation that stability decreases with longer sequences, we propose a time-gated extension of GRNNs. We further put forward node- and edge-gated variants of the GRNN to address the problem of vanishing gradients arising from long-range graph dependencies. The advantages of GRNNs over GNNs and RNNs are demonstrated in a synthetic regression experiment and in a classification problem where seismic wave readings from a network of seismographs are used to predict the region of an earthquake. Finally, the benefits of time, node, and edge gating are experimentally validated in multiple temporal and spatial correlation scenarios.
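To make the idea concrete, the GRNN recurrence can be sketched as h_t = σ(A(S) x_t + B(S) h_{t−1}), where A(S) and B(S) are polynomial graph filters in the graph shift operator S. The following is a minimal sketch of that recurrence for scalar node features; the tanh nonlinearity, the filter orders, and the function names are illustrative assumptions, not the authors' exact implementation. Note that the learnable parameters are just the filter taps, so their count is independent of the graph size N and the sequence length T, matching the scalability claim above.

```python
import numpy as np

def graph_filter(S, taps, x):
    """Apply a polynomial graph filter sum_k taps[k] * S^k to a graph signal x.

    The taps are the learnable parameters; there are len(taps) of them,
    regardless of how many nodes the graph has.
    """
    y = np.zeros_like(x, dtype=float)
    z = x.astype(float)  # z holds S^k x, starting at k = 0
    for a_k in taps:
        y += a_k * z
        z = S @ z  # shift the signal one more hop over the graph
    return y

def grnn_forward(S, X, a_taps, b_taps):
    """Run the GRNN recurrence h_t = tanh(A(S) x_t + B(S) h_{t-1}).

    S      : (N, N) graph shift operator (e.g. adjacency or Laplacian)
    X      : (T, N) sequence of graph signals
    a_taps : filter taps for the input filter A(S)
    b_taps : filter taps for the hidden-state filter B(S)
    Returns the (T, N) sequence of hidden states.
    """
    T, N = X.shape
    h = np.zeros(N)  # initial hidden state
    H = np.zeros((T, N))
    for t in range(T):
        h = np.tanh(graph_filter(S, a_taps, X[t])
                    + graph_filter(S, b_taps, h))
        H[t] = h
    return H
```

In this sketch, the parameter count is len(a_taps) + len(b_taps), so training the same model on a larger graph or a longer sequence requires no additional parameters. A gated variant would additionally compute a gate signal (a scalar, per-node, or per-edge multiplier) from the current input and hidden state and use it to modulate the B(S) h_{t−1} term.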

