Dynamic Measurement of Structural Entropy for Dynamic Graphs

by Runze Yang, et al.

Structural entropy measures the amount of information embedded in graph structure data under a strategy of hierarchical abstracting. Computing this metric requires decoding the optimal encoding tree, i.e., the optimal hierarchical abstracting. In dynamic graph scenarios, we usually need to measure the structural entropy of the updated graph at any given time; however, current structural entropy methods do not support efficient incremental updating of an encoding tree. To address this issue, we propose a novel incremental measurement method of structural entropy for dynamic graphs. First, we present two new dynamic adjustment strategies for one- and two-dimensional encoding trees. Second, we propose a new metric, the Global Invariant, which approximates the updated structural entropy with O(1) computational complexity. We further define the Local Difference as the difference between the updated structural entropy and the Global Invariant, computable with O(n) complexity. Third, we design new efficient incremental algorithms, Incre-1dSE and Incre-2dSE, for computing the updated one- and two-dimensional structural entropy. Furthermore, we theoretically prove that the Local Difference and its first-order absolute moment converge to 0 at a rate of O(log m / m). We conduct extensive experiments on dynamic graph datasets generated by the Hawkes Process, Triad Closure Process, and Partitioning-based Process to evaluate the efficiency of our algorithms and the correctness of the theoretical analysis. Experimental results confirm that our method effectively reduces time consumption: up to 3 times speedup for one-dimensional cases and at least 11 times for two-dimensional cases are achieved on average, while relative errors remain within 2%.
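For context, the one-dimensional structural entropy that the abstract refers to is the degree-distribution entropy H¹(G) = −Σ_v (d_v / 2m) log₂(d_v / 2m), where d_v is the degree of node v and m the number of edges. The sketch below computes it naively and applies a single edge update by mutating the degree map; it is an illustrative baseline only, assuming this standard definition — the paper's actual incremental algorithms (Incre-1dSE, Incre-2dSE) and the Global Invariant approximation are not reproduced here.

```python
import math

def structural_entropy_1d(degrees, m):
    """One-dimensional structural entropy:
    H1(G) = -sum_v (d_v / 2m) * log2(d_v / 2m),
    where d_v is the degree of node v and m is the edge count."""
    return -sum((d / (2 * m)) * math.log2(d / (2 * m))
                for d in degrees.values() if d > 0)

def add_edge(degrees, u, v):
    """Apply one incremental edge insertion to the degree map."""
    degrees[u] = degrees.get(u, 0) + 1
    degrees[v] = degrees.get(v, 0) + 1

# Example: path a-b-c, then insert edge (a, c) to form a triangle.
deg = {"a": 1, "b": 2, "c": 1}
h_before = structural_entropy_1d(deg, m=2)   # entropy of the path
add_edge(deg, "a", "c")
h_after = structural_entropy_1d(deg, m=3)    # entropy of the triangle
```

Recomputing from scratch after every update costs O(n) per edge, which is precisely the overhead the paper's O(1) Global Invariant approximation avoids.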


