Dynamic Network Embedding via Incremental Skip-gram with Negative Sampling

06/09/2019
by   Hao Peng, et al.

Network representation learning, an approach to learning low-dimensional representations of vertices, has attracted considerable research attention recently and has proven extremely useful in many machine learning tasks over large graphs. Most existing methods focus on learning structural representations of vertices in a static network, but cannot guarantee accurate and efficient embeddings in a dynamic network scenario. To address this issue, we present an efficient incremental skip-gram algorithm with negative sampling for dynamic network embedding, along with a set of theoretical analyses that characterize its performance guarantee. Specifically, we first partition a dynamic network over time into an updated part, which includes the addition and deletion of links and vertices, and a retained part. We then factorize the objective function of network embedding into the added, vanished, and retained parts of the network. Next, we provide a new stochastic gradient-based method, guided by these partitions of the network, to update the node and parameter vectors. The proposed algorithm is proven to yield an objective function value whose difference from that of the original objective function is bounded. Experimental results show that our proposal significantly reduces training time while preserving comparable performance. We also demonstrate the correctness of the theoretical analysis and the practical usefulness of dynamic network embedding. We perform extensive experiments on multiple real-world large network datasets, covering multi-label classification and link prediction tasks, to evaluate the effectiveness and efficiency of the proposed framework; speedups of up to 22 times have been achieved.
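To make the incremental idea concrete, the following is a minimal sketch (not the authors' implementation) of skip-gram with negative sampling applied only to the added partition of the network, while vectors in the retained partition are left untouched. All function names, the learning rate, and the partition handling here are illustrative assumptions.

```python
import numpy as np

def sgns_update(emb_in, emb_out, u, v, neg_samples, lr=0.025):
    """One SGNS gradient step for a positive pair (u, v) plus negatives.

    emb_in holds the node (input) vectors; emb_out holds the parameter
    (output/context) vectors, matching the two vector sets the paper updates.
    """
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
    grad_u = np.zeros_like(emb_in[u])

    # Positive pair: increase the score of the observed edge (u, v).
    g = (1.0 - sigmoid(emb_in[u] @ emb_out[v])) * lr
    grad_u += g * emb_out[v]
    emb_out[v] += g * emb_in[u]

    # Negative samples: decrease the score of unobserved pairs (u, n).
    for n in neg_samples:
        g = -sigmoid(emb_in[u] @ emb_out[n]) * lr
        grad_u += g * emb_out[n]
        emb_out[n] += g * emb_in[u]

    emb_in[u] += grad_u

def incremental_update(emb_in, emb_out, added_edges, vocab, rng, k=5):
    """Retrain only on the added partition; retained vectors stay as-is."""
    for u, v in added_edges:
        negs = rng.choice(vocab, size=k)
        sgns_update(emb_in, emb_out, u, v, negs)

# Toy usage: 10 vertices, 8-dimensional embeddings, two newly added edges.
rng = np.random.default_rng(0)
n, d = 10, 8
emb_in = rng.normal(scale=0.1, size=(n, d))
emb_out = rng.normal(scale=0.1, size=(n, d))
before = emb_in.copy()
incremental_update(emb_in, emb_out, [(0, 1), (2, 3)], np.arange(n), rng)
# Only source vertices of added edges move in emb_in; e.g. vertex 5 is retained.
```

The design point this sketch illustrates is the source of the speedup: the cost of an update scales with the size of the changed partition rather than with the whole network.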


Related research:

- 04/13/2017 — Incremental Skip-gram Model with Negative Sampling
- 11/14/2018 — Streaming Network Embedding through Local Actions
- 12/06/2018 — dynnode2vec: Scalable Dynamic Network Embedding
- 08/05/2020 — GloDyNE: Global Topology Preserving Dynamic Network Embedding
- 07/26/2022 — Dynamic Measurement of Structural Entropy for Dynamic Graphs
- 05/07/2018 — Billion-scale Network Embedding with Iterative Random Projection
- 06/25/2020 — Time-varying Graph Representation Learning via Higher-Order Skip-Gram with Negative Sampling
