Catastrophic Forgetting in Deep Graph Networks: an Introductory Benchmark for Graph Classification

03/22/2021
by   Antonio Carta, et al.

In this work, we study the phenomenon of catastrophic forgetting in the graph representation learning scenario. The primary objective of the analysis is to understand whether classical continual learning techniques for flat and sequential data have a tangible impact on performance when applied to graph data. To do so, we experiment with a structure-agnostic model and a deep graph network in a robust and controlled environment on three different datasets. The benchmark is complemented by an investigation of the effect of structure-preserving regularization techniques on catastrophic forgetting. We find that replay is the most effective strategy so far, and also the one that benefits most from the use of regularization. Our findings suggest interesting future research at the intersection of the continual learning and graph representation learning fields. Finally, we provide researchers with a flexible software framework to reproduce our results and carry out further experiments.
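To make the replay strategy mentioned above concrete, here is a minimal sketch of an experience replay buffer of the kind commonly used in continual learning: a fixed-capacity memory of past examples maintained with reservoir sampling, from which mini-batches are drawn and mixed with current-task data. This is an illustrative sketch, not the authors' implementation; the `ReplayBuffer` class and the placeholder graph tuples are hypothetical.

```python
import random

class ReplayBuffer:
    """Fixed-capacity memory of past (graph, label) examples for experience replay.

    Uses reservoir sampling so that, at any point, every example seen so far
    has an equal probability of being in the buffer.
    """

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.buffer = []
        self.seen = 0
        self.rng = random.Random(seed)

    def add(self, example):
        self.seen += 1
        if len(self.buffer) < self.capacity:
            self.buffer.append(example)
        else:
            # Replace a random slot with probability capacity / seen.
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.buffer[j] = example

    def sample(self, k):
        # Draw a replay mini-batch (without replacement) from the memory.
        k = min(k, len(self.buffer))
        return self.rng.sample(self.buffer, k)

# While training on a new task, each optimization step would combine a batch
# of current-task graphs with a batch sampled from the buffer.
buf = ReplayBuffer(capacity=100)
for i in range(1000):
    buf.add(("graph_%d" % i, i % 3))  # placeholder for a (graph, label) pair
replayed = buf.sample(32)
```

In a real continual graph classification loop, the replayed examples would be batched together with the current task's graphs before each gradient step, which is what counteracts forgetting of earlier tasks.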


Related research

01/18/2021 · Does Continual Learning = Catastrophic Forgetting?
Continual learning is known for suffering from catastrophic forgetting, ...

11/25/2022 · Overcoming Catastrophic Forgetting by XAI
Explaining the behaviors of deep neural networks, usually considered as ...

11/30/2021 · Hierarchical Prototype Networks for Continual Graph Representation Learning
Despite significant advances in graph representation learning, little at...

02/07/2023 · Utility-based Perturbed Gradient Descent: An Optimizer for Continual Learning
Modern representation learning methods may fail to adapt quickly under n...

05/26/2023 · Mitigating Catastrophic Forgetting in Long Short-Term Memory Networks
Continual learning on sequential data is critical for many machine learn...

10/17/2022 · Review Learning: Alleviating Catastrophic Forgetting with Generative Replay without Generator
When a deep learning model is sequentially trained on different datasets...

08/24/2022 · Lifelong Learning for Neural powered Mixed Integer Programming
Mixed Integer programs (MIPs) are typically solved by the Branch-and-Bou...
