ROLAND: Graph Learning Framework for Dynamic Graphs

08/15/2022
by Jiaxuan You, et al.

Graph Neural Networks (GNNs) have been successfully applied to many real-world static graphs. However, this success has not fully translated to dynamic graphs due to limitations in model design, evaluation settings, and training strategies. Concretely, existing dynamic GNNs do not incorporate state-of-the-art designs from static GNNs, which limits their performance. Current evaluation settings for dynamic GNNs do not fully reflect the evolving nature of dynamic graphs. Finally, commonly used training methods for dynamic GNNs are not scalable. Here we propose ROLAND, an effective graph representation learning framework for real-world dynamic graphs. At its core, the ROLAND framework lets researchers easily repurpose any static GNN for dynamic graphs. Our insight is to view the node embeddings at different GNN layers as hierarchical node states and then recurrently update them over time. We then introduce a live-update evaluation setting for dynamic graphs that mimics real-world use cases, where GNNs make predictions and are updated on a rolling basis. Finally, we propose a scalable and efficient training approach for dynamic GNNs via incremental training and meta-learning. We conduct experiments on future link prediction tasks over eight different dynamic graph datasets. Under the standard evaluation setting on three datasets, models built using the ROLAND framework achieve on average a 62.7% relative mean reciprocal rank (MRR) improvement over state-of-the-art baselines. We find that these baselines run out of memory on the larger datasets, while ROLAND easily scales to dynamic graphs with 56 million edges. After re-implementing the baselines using the ROLAND training strategy, ROLAND models still achieve on average a 15.5% relative MRR improvement over the baselines.
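The core idea above — treating per-layer node embeddings as hierarchical node states and recurrently updating them across graph snapshots — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation (the paper builds on PyTorch-based static GNNs); the mean-aggregation layer, the moving-average update rule, and all function names here are simplifying assumptions chosen to show the recurrence structure only. The paper also considers learned updaters such as an MLP or GRU in place of the moving average.

```python
import numpy as np

def gnn_layer(H, A, W):
    # One static message-passing layer (illustrative): mean-aggregate
    # neighbor embeddings via adjacency A, then linear transform + ReLU.
    deg = A.sum(axis=1, keepdims=True) + 1e-8
    return np.maximum(((A @ H) / deg) @ W, 0.0)

def moving_average_update(H_prev, H_new, tau=0.5):
    # ROLAND-style hierarchical state update (simplest variant):
    # blend last snapshot's layer state with the freshly computed one.
    return tau * H_prev + (1.0 - tau) * H_new

def roland_forward(A_t, X_t, states_prev, weights, tau=0.5):
    """Process one graph snapshot (A_t, X_t).

    states_prev: list of per-layer node states from snapshot t-1.
    weights:     list of per-layer weight matrices of the base static GNN.
    Returns the updated per-layer node states for snapshot t.
    """
    H = X_t
    states_new = []
    for H_prev, W in zip(states_prev, weights):
        H = gnn_layer(H, A_t, W)           # static-GNN computation
        H = moving_average_update(H_prev, H, tau)  # recurrent state update
        states_new.append(H)
    return states_new
```

In a live-update loop one would initialize the states to zeros, and at each new snapshot first score candidate future links from the top-layer state, then call `roland_forward` to refresh the states before moving on.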

