TransformerG2G: Adaptive time-stepping for learning temporal graph embeddings using transformers

07/05/2023
by Alan John Varghese, et al.

Dynamic graph embedding has emerged as a very effective technique for addressing diverse temporal graph analytic tasks (e.g., link prediction, node classification, recommender systems, anomaly detection, and graph generation) in various applications. Such temporal graphs exhibit heterogeneous transient dynamics, varying time intervals, and highly evolving node features throughout their evolution. Hence, incorporating long-range dependencies from the historical graph context plays a crucial role in accurately learning their temporal dynamics. In this paper, we develop a graph embedding model with uncertainty quantification, TransformerG2G, which exploits a transformer encoder to first learn intermediate node representations from the current state (timestamp t) and the preceding context (timestamps t-1, ..., t-l, where l is the context length). We then employ two projection layers to generate a lower-dimensional multivariate Gaussian distribution as each node's latent embedding at timestamp t. We consider diverse benchmarks with varying levels of "novelty" as measured by TEA plots. Our experiments demonstrate that the proposed TransformerG2G model outperforms conventional multi-step methods and our prior work (DynG2G) in terms of both link prediction accuracy and computational efficiency, especially for graphs with a high degree of novelty. Furthermore, the learned time-dependent attention weights across multiple graph snapshots reveal an automatic adaptive time-stepping enabled by the transformer. Importantly, by examining the attention weights, we can uncover temporal dependencies, identify influential elements, and gain insights into the complex interactions within the graph structure. For example, we identified a strong correlation between attention weights and node degree at various stages of the graph topology's evolution.
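The architecture described in the abstract, a transformer encoder applied to a node's short history of states followed by two projection heads that parameterize a per-node Gaussian embedding, can be sketched roughly as follows. This is a minimal illustration under assumed layer sizes and activations, not the authors' implementation; the class name, the use of PyTorch's nn.TransformerEncoder, and the softplus-activated variance head are assumptions made for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TransformerG2GSketch(nn.Module):
    """Minimal sketch of the TransformerG2G idea: encode a node's history over
    l+1 timestamps with a transformer encoder, then project the final-timestamp
    representation to the mean and variance of a low-dimensional Gaussian
    embedding. Layer sizes and activations are illustrative assumptions."""

    def __init__(self, in_dim, hidden_dim=256, embed_dim=64, n_heads=4, n_layers=2):
        super().__init__()
        self.input_proj = nn.Linear(in_dim, hidden_dim)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=hidden_dim, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # Two projection heads: one for the mean, one for the (positive) variance.
        self.mu_head = nn.Linear(hidden_dim, embed_dim)
        self.sigma_head = nn.Linear(hidden_dim, embed_dim)

    def forward(self, history):
        # history: (num_nodes, l + 1, in_dim), node states at timestamps t-l, ..., t.
        h = self.encoder(self.input_proj(history))        # (num_nodes, l + 1, hidden_dim)
        h_t = h[:, -1, :]                                 # representation at the current timestamp t
        mu = self.mu_head(h_t)                            # mean of each node's Gaussian embedding
        sigma = F.softplus(self.sigma_head(h_t)) + 1e-6   # elementwise variance, kept positive
        return mu, sigma

# Example: 1000 nodes, context length l = 4 (so 5 timestamps), 32-dim node states.
model = TransformerG2GSketch(in_dim=32)
mu, sigma = model(torch.randn(1000, 5, 32))
```

In Graph2Gauss-style models such as DynG2G, embeddings of this form are typically trained with a contrastive ranking loss based on the KL divergence between node distributions, so that nodes close in the graph receive distributions with small divergence; that loss is omitted from the sketch above.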


research · 09/28/2021
DynG2G: An Efficient Stochastic Graph Embedding Method for Temporal Graphs
Dynamic graph embedding has gained great attention recently due to its c...

research · 03/21/2019
Node Embedding over Temporal Graphs
In this work, we present a method for node embedding in temporal graphs....

research · 07/25/2020
Learning Attribute-Structure Co-Evolutions in Dynamic Graphs
Most graph neural network models learn embeddings of nodes in static att...

research · 06/01/2023
Graph-Level Embedding for Time-Evolving Graphs
Graph representation learning (also known as network embedding) has been...

research · 06/13/2023
Time-aware Graph Structure Learning via Sequence Prediction on Temporal Graphs
Temporal Graph Learning, which aims to model the time-evolving nature of...

research · 08/27/2020
DVE: Dynamic Variational Embeddings with Applications in Recommender Systems
Embedding is a useful technique to project a high-dimensional feature in...

research · 08/28/2019
Effective and Efficient Network Embedding Initialization via Graph Partitioning
Network embedding has been intensively studied in the literature and wid...
