Inductive Representation Learning in Temporal Networks via Causal Anonymous Walks

01/15/2021
by Yanbang Wang, et al.

Temporal networks serve as abstractions of many real-world dynamic systems. These networks typically evolve according to certain laws, such as the law of triadic closure, which is universal in social networks. Inductive representation learning of temporal networks should be able to capture such laws and further apply to systems that follow the same laws but have not been seen during the training stage. Previous works in this area depend on either network node identities or rich edge attributes and typically fail to extract these laws. Here, we propose Causal Anonymous Walks (CAWs) to inductively represent a temporal network. CAWs are extracted by temporal random walks and work as an automatic retrieval of temporal network motifs to represent network dynamics, while avoiding the time-consuming selection and counting of those motifs. CAWs adopt a novel anonymization strategy that replaces node identities with the hitting counts of the nodes based on a set of sampled walks, which keeps the method inductive and simultaneously establishes the correlation between motifs. We further propose a neural network model, CAW-N, to encode CAWs, and pair it with a CAW sampling strategy with constant memory and time cost to support online training and inference. CAW-N is evaluated on link prediction over 6 real temporal networks; it uniformly outperforms previous SOTA methods by an average of 15% AUC in the inductive setting and also outperforms them in 5 out of the 6 networks in the transductive setting.
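The anonymization idea can be illustrated with a short, self-contained sketch. This is a minimal illustration rather than the authors' implementation: the adjacency format and the helper names sample_temporal_walks and anonymize are assumptions made here. Walks are sampled backward in time from a node, and each node identity on a walk is then replaced by its hitting counts, i.e., how often it appears at each position of every sampled walk.

```python
import random

def sample_temporal_walks(adj, start_node, t_start, num_walks=3, walk_len=3):
    """Sample backward temporal random walks starting from `start_node`.

    `adj` maps a node to a list of (neighbor, timestamp) interactions.
    Each step only follows an interaction strictly earlier than the
    current time, so every walk respects causality.
    """
    walks = []
    for _ in range(num_walks):
        walk, node, t = [start_node], start_node, t_start
        for _ in range(walk_len):
            earlier = [(v, tv) for v, tv in adj.get(node, []) if tv < t]
            if not earlier:
                break
            node, t = random.choice(earlier)
            walk.append(node)
        walks.append(walk)
    return walks

def anonymize(walks, walk_len=3):
    """Replace node identities by position-wise hitting counts.

    For each node on a walk, count at which positions it appears in every
    sampled walk.  The resulting count vectors contain no raw node IDs,
    yet they preserve the correlation between the sampled walks.
    """
    def position_counts(node, walk):
        counts = [0] * (walk_len + 1)
        for pos, v in enumerate(walk):
            if v == node:
                counts[pos] += 1
        return tuple(counts)

    return [
        [tuple(position_counts(node, w) for w in walks) for node in walk]
        for walk in walks
    ]

# Toy usage: a tiny temporal graph as {node: [(neighbor, time), ...]}
adj = {
    'a': [('b', 5), ('c', 3)],
    'b': [('a', 5), ('c', 4)],
    'c': [('a', 3), ('b', 4)],
}
walks = sample_temporal_walks(adj, 'a', t_start=6)
print(anonymize(walks))
```

Because the counts are defined relative to the sampled walks rather than to global node IDs, structurally equivalent neighborhoods yield the same anonymized walks even if their nodes were never seen during training, which is what keeps the representation inductive.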


research · 06/19/2023 · CAT-Walk: Inductive Hypergraph Learning via Set Walks
Temporal hypergraphs provide a powerful paradigm for modeling time-depen...

research · 08/19/2021 · Temporal Graph Network Embedding with Causal Anonymous Walks Representations
Many tasks in graph machine learning, such as link prediction and node c...

research · 09/02/2022 · Neighborhood-aware Scalable Temporal Network Representation Learning
Temporal networks have been widely used to model real-world complex syst...

research · 04/12/2019 · Temporal Network Representation Learning
Networks evolve continuously over time with the addition, deletion, and ...

research · 09/17/2022 · De Bruijn goes Neural: Causality-Aware Graph Neural Networks for Time Series Data on Dynamic Graphs
We introduce De Bruijn Graph Neural Networks (DBGNNs), a novel time-awar...

research · 03/06/2023 · SUREL+: Moving from Walks to Sets for Scalable Subgraph-based Graph Representation Learning
Subgraph-based graph representation learning (SGRL) has recently emerged...

research · 03/27/2013 · Towards The Inductive Acquisition of Temporal Knowledge
The ability to predict the future in a given domain can be acquired by d...
