Scaling Up Dynamic Graph Representation Learning via Spiking Neural Networks

08/15/2022
by Jintang Li, et al.

Recent years have seen a surge of research on dynamic graph representation learning, which aims to model temporal graphs that evolve constantly over time. However, current work typically models graph dynamics with recurrent neural networks (RNNs), which incur severe computation and memory overheads on large temporal graphs. To date, the scalability of dynamic graph representation learning on large temporal graphs remains one of the major challenges. In this paper, we present a scalable framework, SpikeNet, to efficiently capture the temporal and structural patterns of temporal graphs. We explore a new direction: capturing the evolving dynamics of temporal graphs with spiking neural networks (SNNs) instead of RNNs. As a low-power alternative to RNNs, SNNs explicitly model graph dynamics as spike trains of neuron populations and enable efficient spike-based propagation. Experiments on three large real-world temporal graph datasets demonstrate that SpikeNet outperforms strong baselines on the temporal node classification task with lower computational cost. In particular, SpikeNet generalizes to a large temporal graph (2M nodes and 13M edges) with significantly fewer parameters and less computation overhead. Our code is publicly available at https://github.com/EdisonLeeeee/SpikeNet
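The linked repository contains the authors' implementation. As a rough illustration of the core idea only, not the actual SpikeNet architecture, the sketch below shows how a leaky integrate-and-fire (LIF) neuron can replace an RNN cell for carrying node state across graph snapshots: neighbourhood-aggregated inputs are integrated into a membrane potential that emits binary spikes. All module and parameter names here (LIFCell, SpikingSnapshotEncoder, decay, threshold) are illustrative assumptions.

```python
# Minimal sketch, assuming a snapshot-based temporal graph and PyTorch.
# Not the authors' SpikeNet code; names and hyperparameters are illustrative.

import torch
import torch.nn as nn


class LIFCell(nn.Module):
    """Leaky integrate-and-fire neuron: the membrane potential decays,
    accumulates input current, and fires a binary spike at a threshold."""

    def __init__(self, decay: float = 0.9, threshold: float = 1.0):
        super().__init__()
        self.decay = decay
        self.threshold = threshold

    def forward(self, current: torch.Tensor, membrane: torch.Tensor):
        membrane = self.decay * membrane + current      # leaky integration
        spike = (membrane >= self.threshold).float()    # binary spike train
        membrane = membrane * (1.0 - spike)             # hard reset after firing
        return spike, membrane


class SpikingSnapshotEncoder(nn.Module):
    """Aggregate neighbours with a shared linear map, then let LIF neurons
    carry state across snapshots instead of an RNN."""

    def __init__(self, in_dim: int, hid_dim: int):
        super().__init__()
        self.proj = nn.Linear(in_dim, hid_dim)
        self.lif = LIFCell()

    def forward(self, snapshot_adjs, features):
        # snapshot_adjs: list of (N, N) adjacency tensors, one per time step
        # features:      (N, in_dim) static node features
        membrane = torch.zeros(features.size(0), self.proj.out_features)
        spikes = []
        for adj in snapshot_adjs:
            current = self.proj(adj @ features)         # input to spike-based propagation
            spike, membrane = self.lif(current, membrane)
            spikes.append(spike)
        return torch.stack(spikes, dim=0)               # (T, N, hid_dim) spike trains


if __name__ == "__main__":
    N, F, H, T = 5, 8, 16, 3
    adjs = [torch.eye(N) for _ in range(T)]             # toy snapshots
    x = torch.randn(N, F)
    out = SpikingSnapshotEncoder(F, H)(adjs, x)
    print(out.shape)                                    # torch.Size([3, 5, 16])
```

Note that training through the non-differentiable spike function would require a surrogate gradient in practice; the sketch omits this for brevity.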

Related research

04/20/2023 · Recurrent Transformer for Dynamic Graph Representation Learning with Edge Temporal States
Dynamic graph representation learning is growing as a trending yet chall...

06/30/2021 · Exploiting Spiking Dynamics with Spatial-temporal Feature Normalization in Graph Learning
Biological spiking neurons with intrinsic dynamics underlie the powerful...

05/18/2022 · Relational representation learning with spike trains
Relational representation learning has lately received an increase in in...

01/16/2016 · Conversion of Artificial Recurrent Neural Networks to Spiking Neural Networks for Low-power Neuromorphic Hardware
In recent years the field of neuromorphic low-power systems that consume...

08/02/2021 · Representation learning for neural population activity with Neural Data Transformers
Neural population activity is theorized to reflect an underlying dynamic...

02/22/2023 · Learning Dynamic Graph Embeddings with Neural Controlled Differential Equations
This paper focuses on representation learning for dynamic graphs with te...

04/27/2021 · SpikE: spike-based embeddings for multi-relational graph data
Despite the recent success of reconciling spike-based coding with the er...
