Streaming Graph Neural Networks via Continual Learning

09/23/2020
by Junshan Wang, et al.

Graph neural networks (GNNs) have achieved strong performance in various applications. In the real world, however, network data usually arrives in a streaming fashion, and the distributions of patterns, i.e., the neighborhood information of nodes, may shift over time. The GNN model therefore needs to learn new patterns that it has not yet captured, but incremental learning leads to catastrophic forgetting, where historical knowledge is overwritten by newly learned knowledge. It is thus important, though rarely studied, to train a GNN to learn new patterns and maintain existing patterns simultaneously. In this paper, we propose a streaming GNN model based on continual learning, so that the model is trained incrementally and up-to-date node representations can be obtained at each time step. First, we design an approximation algorithm based on information propagation to detect newly emerging patterns efficiently. Second, we combine two perspectives, data replay and model regularization, to consolidate existing patterns. Specifically, we design a hierarchy-importance sampling strategy for nodes and derive a weighted regularization term for the GNN parameters, achieving greater stability and generalization in knowledge consolidation. Our model is evaluated on real and synthetic datasets and compared with multiple baselines. The node classification results show that our model updates its parameters efficiently and achieves performance comparable to full retraining. In addition, we conduct a case study on the synthetic data and analyze each component of the model, illustrating its ability to learn new knowledge and retain existing knowledge from different perspectives.
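As a rough illustration of the two consolidation ideas the abstract mentions, the sketch below (PyTorch) trains a toy one-layer GNN on newly arrived nodes together with a set of replayed historical nodes, while a weighted regularization term penalizes drift of parameters away from their previous values. All names here (SimpleGNN, ewc_penalty, the uniform importance weights, the dense adjacency) are illustrative assumptions, not the paper's actual algorithm or API.

```python
# Hypothetical sketch: incremental GNN update combining node replay with a
# weighted parameter-regularization term (EWC-style). Not the authors' method.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SimpleGNN(nn.Module):
    """Toy two-step graph convolution on a dense normalized adjacency."""

    def __init__(self, in_dim, hid_dim, n_classes):
        super().__init__()
        self.w1 = nn.Linear(in_dim, hid_dim)
        self.w2 = nn.Linear(hid_dim, n_classes)

    def forward(self, adj, x):
        h = F.relu(adj @ self.w1(x))   # propagate neighborhood information, then transform
        return adj @ self.w2(h)        # per-node class logits


def weighted_reg(model, old_params, importance):
    """Penalize changes to parameters in proportion to their (assumed) importance."""
    loss = 0.0
    for name, p in model.named_parameters():
        loss = loss + (importance[name] * (p - old_params[name]) ** 2).sum()
    return loss


def train_step(model, opt, adj, x, y, new_nodes, replay_nodes,
               old_params, importance, lam=1.0):
    """One incremental update: fit new nodes plus replayed nodes, regularize parameters."""
    opt.zero_grad()
    logits = model(adj, x)
    idx = torch.cat([new_nodes, replay_nodes])        # new patterns + sampled historical nodes
    task_loss = F.cross_entropy(logits[idx], y[idx])
    reg_loss = lam * weighted_reg(model, old_params, importance)
    (task_loss + reg_loss).backward()
    opt.step()
    return task_loss.item(), reg_loss.item()


# Toy usage with random data (10 nodes, 5 features, 3 classes).
if __name__ == "__main__":
    n, d, c = 10, 5, 3
    adj = torch.eye(n)                                # placeholder for a normalized adjacency
    x, y = torch.randn(n, d), torch.randint(0, c, (n,))
    model = SimpleGNN(d, 16, c)
    opt = torch.optim.Adam(model.parameters(), lr=0.01)
    old_params = {k: v.detach().clone() for k, v in model.named_parameters()}
    importance = {k: torch.ones_like(v) for k, v in model.named_parameters()}
    new_nodes, replay_nodes = torch.tensor([7, 8, 9]), torch.tensor([0, 2])
    print(train_step(model, opt, adj, x, y, new_nodes, replay_nodes, old_params, importance))
```

In the paper's setting, the replayed nodes would come from the proposed hierarchy-importance sampling strategy and the importance weights from the derived regularization term; here both are stand-ins (a fixed index set and uniform weights) to keep the example self-contained.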
