RDGSL: Dynamic Graph Representation Learning with Structure Learning

by Siwei Zhang, et al.

Temporal Graph Networks (TGNs) have shown remarkable performance in learning representations for continuous-time dynamic graphs. However, real-world dynamic graphs typically contain diverse and intricate noise. Noise can significantly degrade the quality of representation generation, impeding the effectiveness of TGNs in downstream tasks. Though structure learning is widely applied to mitigate noise in static graphs, its adaptation to dynamic graph settings poses two significant challenges. i) Noise dynamics. Existing structure learning methods are ill-equipped to address the temporal aspect of noise, hampering their effectiveness against such ever-changing noise patterns. ii) More severe noise. Noise may be introduced along with multiple interactions between two nodes, leading to the re-pollution of these nodes and consequently causing more severe noise compared to static graphs. In this paper, we present RDGSL, a representation learning method for continuous-time dynamic graphs. Alongside it, we propose dynamic graph structure learning, a novel supervisory signal that empowers RDGSL with the ability to effectively combat noise in dynamic graphs. To address the noise dynamics issue, we introduce the Dynamic Graph Filter, where we innovatively propose a dynamic noise function that captures both current and historical noise, enabling us to assess the temporal aspect of noise and generate a denoised graph. We further propose the Temporal Embedding Learner to tackle the challenge of more severe noise; it utilizes an attention mechanism to selectively turn a blind eye to noisy edges and hence focus on normal edges, enhancing the expressiveness of representation generation so that it remains resilient to noise. Our method demonstrates robustness in downstream tasks, yielding up to 5.1% improvement in evolving classification versus the second-best baseline.
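The attention-based edge suppression described above can be illustrated with a minimal sketch. This is not the authors' implementation: the function name, the dot-product attention, the precomputed `noise_scores`, and the hard threshold `tau` are all simplifying assumptions standing in for RDGSL's learned dynamic noise function and Temporal Embedding Learner.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

def denoised_attention(h_center, h_neighbors, noise_scores, tau=0.5):
    """Aggregate a node's temporal neighbors with attention weights
    that ignore edges flagged as noisy.

    h_center:     (d,) embedding of the target node
    h_neighbors:  (n, d) embeddings of its temporal neighbors
    noise_scores: (n,) per-edge noise estimates in [0, 1]
                  (in RDGSL these would come from the learned
                  dynamic noise function; here they are given)
    """
    # Raw attention logits: dot-product similarity with the center node.
    logits = h_neighbors @ h_center
    # "Turn a blind eye" to edges whose noise score exceeds tau by
    # pushing their logits to -inf before the softmax.
    logits = np.where(noise_scores > tau, -1e9, logits)
    weights = softmax(logits)
    # Message aggregation uses only the surviving (normal) edges.
    return weights @ h_neighbors, weights
```

In a full model, the noise scores and attention parameters would be trained jointly with the dynamic graph structure learning objective rather than supplied externally.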


DyTed: Disentangling Temporal Invariance and Fluctuations in Dynamic Graph Representation Learning

Unsupervised representation learning for dynamic graphs has attracted a ...

Piecewise-Velocity Model for Learning Continuous-time Dynamic Node Representations

Networks have become indispensable and ubiquitous structures in many fie...

ConvDySAT: Deep Neural Representation Learning on Dynamic Graphs via Self-Attention and Convolutional Neural Networks

Learning node representations on temporal graphs is a fundamental step t...

Learning Dynamic Graph Embeddings with Neural Controlled Differential Equations

This paper focuses on representation learning for dynamic graphs with te...

Time-aware Graph Structure Learning via Sequence Prediction on Temporal Graphs

Temporal Graph Learning, which aims to model the time-evolving nature of...

Efficient Dynamic Graph Representation Learning at Scale

Dynamic graphs with ordered sequences of events between nodes are preval...

Modeling and Mining Multi-Aspect Graphs With Scalable Streaming Tensor Decomposition

Graphs emerge in almost every real-world application domain, ranging fro...
