Dynamic Graph Representation Learning via Graph Transformer Networks

11/19/2021
by Weilin Cong, et al.

Dynamic graph representation learning is an important task with widespread applications. Previous dynamic graph learning methods are usually sensitive to noisy graph information such as missing or spurious connections, which can degrade performance and generalization. To overcome this challenge, we propose a Transformer-based dynamic graph learning method named Dynamic Graph Transformer (DGT) with spatial-temporal encoding to effectively learn graph topology and capture implicit links. To improve generalization, we introduce two complementary self-supervised pre-training tasks and show, via an information-theoretic analysis, that jointly optimizing the two tasks results in a smaller Bayesian error rate. We also propose a temporal-union graph structure and a target-context node sampling strategy for efficient and scalable training. Extensive experiments on real-world datasets illustrate that DGT achieves superior performance compared with several state-of-the-art baselines.
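The spatial-temporal encoding idea can be illustrated with a minimal sketch: inject a learned spatial signal (e.g. hop distance from a target node) and a sinusoidal encoding of timestamps into node features before a standard Transformer layer. All module names, the hop-distance choice, and hyperparameters below are assumptions for illustration, not DGT's actual architecture.

```python
import math
import torch
import torch.nn as nn

class SpatialTemporalEncoding(nn.Module):
    """Illustrative sketch of a spatial-temporal encoding for a graph
    Transformer. Spatial structure is encoded by a learned embedding of
    hop distance; time is encoded with fixed sinusoidal features."""

    def __init__(self, dim: int, max_hops: int = 8):
        super().__init__()
        assert dim % 2 == 0, "dim must be even for sin/cos pairing"
        self.dim = dim
        # One learned vector per hop distance (assumed spatial signal).
        self.spatial = nn.Embedding(max_hops, dim)

    def temporal(self, t: torch.Tensor) -> torch.Tensor:
        # Sinusoidal encoding of timestamps: [N] -> [N, dim].
        freqs = torch.exp(
            torch.arange(0, self.dim, 2, dtype=torch.float32)
            * (-math.log(10000.0) / self.dim)
        )
        angles = t.float().unsqueeze(-1) * freqs
        return torch.cat([torch.sin(angles), torch.cos(angles)], dim=-1)

    def forward(self, x: torch.Tensor, hop: torch.Tensor,
                t: torch.Tensor) -> torch.Tensor:
        # Add both encodings to the raw node features; the result can be
        # fed to an ordinary nn.TransformerEncoder over the node set.
        return x + self.spatial(hop) + self.temporal(t)
```

Because the encodings are simply summed into the input, self-attention can weigh nodes by both structural proximity and recency, which is one way a Transformer can capture implicit links that are missing from the observed edge set.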


Related research:
- 10/26/2021 · Pairwise Half-graph Discrimination: A Simple Graph-level Self-supervised Strategy for Pre-training Graph Neural Networks ("Self-supervised learning has gradually emerged as a powerful technique f...")
- 04/20/2023 · Recurrent Transformer for Dynamic Graph Representation Learning with Edge Temporal States ("Dynamic graph representation learning is growing as a trending yet chall...")
- 06/14/2023 · Self-supervised Learning and Graph Classification under Heterophily ("Self-supervised learning has shown its promising capability in graph rep...")
- 10/19/2022 · Self-supervised Heterogeneous Graph Pre-training Based on Structural Clustering ("Recent self-supervised pre-training methods on Heterogeneous Information...")
- 03/02/2022 · Self-supervised Transformer for Deepfake Detection ("The fast evolution and widespread of deepfake techniques in real-world s...")
- 06/12/2021 · Curriculum Pre-Training Heterogeneous Subgraph Transformer for Top-N Recommendation ("Due to the flexibility in modelling data heterogeneity, heterogeneous in...")
- 03/23/2023 · Towards Better Dynamic Graph Learning: New Architecture and Unified Library ("We propose DyGFormer, a new Transformer-based architecture for dynamic g...")
