Hierarchical Prototype Networks for Continual Graph Representation Learning

11/30/2021
by Xikun Zhang, et al.

Despite significant advances in graph representation learning, little attention has been paid to the more practical continual learning scenario in which new categories of nodes (e.g., new research areas in citation networks, or new types of products in co-purchasing networks) and their associated edges continuously emerge, causing catastrophic forgetting of previous categories. Existing methods either ignore the rich topological information or sacrifice plasticity for stability. To address these issues, we present Hierarchical Prototype Networks (HPNs), which extract different levels of abstract knowledge, in the form of prototypes, to represent the continuously expanding graphs. Specifically, we first leverage a set of Atomic Feature Extractors (AFEs) to encode both the elemental attribute information and the topological structure of the target node. Next, we develop HPNs to adaptively select relevant AFEs and represent each node with three levels of prototypes. In this way, whenever a new category of nodes arrives, only the relevant AFEs and prototypes at each level are activated and refined, while the others remain untouched, preserving performance on existing nodes. Theoretically, we first show that the memory consumption of HPNs is bounded regardless of how many tasks are encountered. We then prove that, under mild constraints, learning new tasks will not alter the prototypes matched to previous data, thereby eliminating the forgetting problem. These theoretical results are supported by experiments on five datasets, showing that HPNs not only outperform state-of-the-art baselines but also consume relatively little memory.
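
The architectural details live in the full paper; as a rough illustration of the mechanism the abstract describes (AFE encoding plus hierarchical prototype matching), here is a minimal PyTorch sketch. Everything in it is an assumption made for illustration, not the authors' implementation: the class name `HPNSketch`, the threshold `tau`, linear AFEs, mean neighbor aggregation, element-wise max as a stand-in for adaptive AFE selection, and cosine-similarity prototype matching.

```python
import torch
import torch.nn.functional as F


class HPNSketch(torch.nn.Module):
    """Minimal sketch of the HPN idea; NOT the authors' implementation.

    Assumptions for illustration: AFEs are linear maps over the target
    node's features and its mean-aggregated neighbor features; each
    prototype level is a growable bank matched by cosine similarity,
    and a new prototype is spawned when the best match falls below `tau`.
    """

    def __init__(self, in_dim, hid_dim, n_afes=4, tau=0.5):
        super().__init__()
        # Atomic Feature Extractors: one set for node attributes,
        # one set for (aggregated) topological/neighbor information.
        self.afe_node = torch.nn.ModuleList(
            [torch.nn.Linear(in_dim, hid_dim) for _ in range(n_afes)])
        self.afe_struct = torch.nn.ModuleList(
            [torch.nn.Linear(in_dim, hid_dim) for _ in range(n_afes)])
        # Three prototype banks, one per level (plain tensors here; the
        # paper bounds their total size under mild constraints).
        self.protos = [torch.zeros(0, hid_dim) for _ in range(3)]
        self.tau = tau

    def _match(self, level, z):
        """Return the nearest prototype at `level`; grow the bank if none is close."""
        bank = self.protos[level]
        if bank.numel() > 0:
            sim = F.cosine_similarity(z.unsqueeze(0), bank, dim=1)
            best, idx = sim.max(dim=0)
            if best >= self.tau:  # reuse: existing knowledge covers this node
                return bank[idx]
        # No sufficiently similar prototype: establish a new one.
        self.protos[level] = torch.cat([bank, z.detach().unsqueeze(0)])
        return z

    def forward(self, x, neigh_x):
        # x: (in_dim,) target-node features; neigh_x: (k, in_dim) neighbors.
        a = torch.stack([f(x) for f in self.afe_node])
        s = torch.stack([f(neigh_x.mean(dim=0)) for f in self.afe_struct])
        # Adaptive AFE selection, crudely approximated by an element-wise max.
        z = torch.cat([a, s]).max(dim=0).values
        # Pass the representation through the three prototype levels.
        for level in range(3):
            z = self._match(level, F.normalize(z, dim=0))
        return z
```

For instance, `HPNSketch(in_dim=128, hid_dim=64)(torch.randn(128), torch.randn(5, 128))` yields an embedding while growing the prototype banks only when no existing prototype is similar enough, which loosely mirrors the abstract's claim that only the relevant AFEs and prototypes are activated and refined for new node categories.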


Related research

10/06/2020 - Disentangle-based Continual Graph Representation Learning
Graph embedding (GE) methods embed nodes (and/or edges) in graph into a ...

03/22/2021 - Catastrophic Forgetting in Deep Graph Networks: an Introductory Benchmark for Graph Classification
In this work, we study the phenomenon of catastrophic forgetting in the ...

10/06/2022 - Topological Continual Learning with Wasserstein Distance and Barycenter
Continual learning in neural networks suffers from a phenomenon called c...

05/05/2021 - Schematic Memory Persistence and Transience for Efficient and Robust Continual Learning
Continual learning is considered a promising step towards next-generatio...

05/23/2023 - Continual Learning on Dynamic Graphs via Parameter Isolation
Many real-world graph learning tasks require handling dynamic graphs whe...

11/26/2020 - Better Knowledge Retention through Metric Learning
In continual learning, new categories may be introduced over time, and a...
