TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning

05/17/2021
by Lu Wang, et al.

Dynamic graph modeling has recently attracted much attention due to its extensive applications in many real-world scenarios, such as recommendation systems, financial transactions, and social networks. Although many works have been proposed for dynamic graph modeling in recent years, effective and scalable models are yet to be developed. In this paper, we propose a novel graph neural network approach, called TCL, which deals with the dynamically-evolving graph in a continuous-time fashion and enables effective dynamic node representation learning that captures both temporal and topological information. Technically, our model contains three novel aspects. First, we generalize the vanilla Transformer to temporal graph learning scenarios and design a graph-topology-aware transformer. Second, on top of the proposed graph transformer, we introduce a two-stream encoder that separately extracts representations from the temporal neighborhoods associated with the two interaction nodes and then uses a co-attentional transformer to model their inter-dependencies at a semantic level. Third, inspired by recent advances in contrastive learning, we propose to optimize our model by maximizing the mutual information (MI) between the predictive representations of the two future interaction nodes. As a result, the dynamic representations preserve high-level (global) semantics about interactions and are thus robust to noisy interactions. To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs. We evaluate our model on four benchmark datasets for interaction prediction, and experimental results demonstrate the superiority of our model.
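The abstract does not spell out the MI objective, but contrastive losses of this kind are commonly estimated with an InfoNCE-style criterion: matching (source, destination) representations of the same interaction form positive pairs, while other interactions in the batch serve as negatives. The sketch below is a minimal, generic NumPy illustration of that idea, not the paper's exact loss; the function name `infonce_loss` and the temperature value are assumptions for illustration.

```python
import numpy as np

def infonce_loss(z_src, z_dst, temperature=0.1):
    """InfoNCE-style lower bound on MI between paired node representations.

    z_src, z_dst: (batch, dim) arrays; row i of z_src and row i of z_dst
    come from the same interaction (positive pair), and every other row in
    the batch acts as a negative.  (Hypothetical sketch, not TCL's code.)
    """
    # Cosine similarity: L2-normalize, then take scaled dot products.
    z_src = z_src / np.linalg.norm(z_src, axis=1, keepdims=True)
    z_dst = z_dst / np.linalg.norm(z_dst, axis=1, keepdims=True)
    logits = z_src @ z_dst.T / temperature        # (batch, batch)

    # Row-wise log-softmax; the positives sit on the diagonal.
    logits = logits - logits.max(axis=1, keepdims=True)   # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))
```

Minimizing this loss pushes each source representation toward its own destination and away from the other destinations in the batch, which is one standard way to maximize an MI lower bound between the two views.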

Related research:

- Efficient-Dyn: Dynamic Graph Representation Learning via Event-based Temporal Sparse Attention Network (01/04/2022)
- Representation Learning for Dynamic Hyperedges (12/19/2021)
- Contrastive Bidirectional Transformer for Temporal Representation Learning (06/13/2019)
- Recurrent Transformer for Dynamic Graph Representation Learning with Edge Temporal States (04/20/2023)
- Decoupled Graph Neural Networks for Large Dynamic Graphs (05/14/2023)
- Intensity Profile Projection: A Framework for Continuous-Time Representation Learning for Dynamic Networks (06/09/2023)
- Continuous-Time and Multi-Level Graph Representation Learning for Origin-Destination Demand Prediction (06/30/2022)
