Graph Condensation for Inductive Node Representation Learning

07/29/2023
by Xinyi Gao et al.

Graph neural networks (GNNs) encounter significant computational challenges when handling large-scale graphs, severely restricting their efficacy across diverse applications. To address this limitation, graph condensation has emerged as a promising technique: it constructs a small synthetic graph on which GNNs can be trained efficiently while retaining performance. However, owing to the topological structure among nodes, graph condensation can only condense the observed training nodes and their corresponding structure, and thus cannot effectively handle unseen data. Consequently, the original large graph is still required at inference time to perform message passing to inductive nodes, incurring substantial computational demands. To overcome this issue, we propose mapping-aware graph condensation (MCond), which explicitly learns a one-to-many node mapping from original nodes to synthetic nodes so that new nodes can be seamlessly integrated into the synthetic graph for inductive representation learning. This enables direct information propagation on the synthetic graph, which is far more efficient than on the original large graph. Specifically, MCond employs an alternating optimization scheme with novel loss terms from both transductive and inductive perspectives, so that graph condensation and node-mapping learning mutually promote each other. Extensive experiments demonstrate the efficacy of our approach for inductive inference. On the Reddit dataset, MCond achieves up to a 121.5x inference speedup and a 55.9x reduction in storage requirements compared with counterparts based on the original graph.
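To make the core idea concrete, the following is a minimal numpy sketch of inductive inference via a learned node mapping: a new node's connections to original training nodes are translated through a mapping matrix into connections to synthetic nodes, so aggregation runs only on the small condensed graph. All sizes, variable names, and the mean/concat aggregator below are illustrative assumptions, not the paper's exact formulation (MCond learns the mapping and condensed graph jointly; here they are random placeholders).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: N original training nodes, n << N synthetic nodes, d features.
N, n, d = 1000, 50, 16

# Placeholders for learned artifacts (MCond optimizes these jointly):
M = rng.random((N, n))                 # one-to-many mapping, original -> synthetic
M = M / M.sum(axis=1, keepdims=True)   # row-normalize: each original node distributes mass over synthetic nodes
X_syn = rng.standard_normal((n, d))    # condensed node features
A_syn = rng.random((n, n))
A_syn = (A_syn + A_syn.T) / 2          # condensed adjacency (small, dense)

# One propagation step inside the synthetic graph (row-normalized mean aggregation):
X_prop = (A_syn @ X_syn) / A_syn.sum(axis=1, keepdims=True)

# An unseen (inductive) node arrives with edges to ORIGINAL training nodes...
a_new = (rng.random(N) < 0.01).astype(float)   # sparse connection pattern
x_new = rng.standard_normal(d)                 # its own features

# ...which the mapping converts into edge weights to SYNTHETIC nodes,
# so message passing never touches the original large graph.
a_syn_new = a_new @ M                          # shape (n,)

# Mean-aggregate synthetic neighbors, then concatenate with the node's
# own features (GraphSAGE-style combine, used here purely for illustration).
agg = (a_syn_new @ X_prop) / max(a_syn_new.sum(), 1e-8)
h_new = np.concatenate([x_new, agg])           # shape (2*d,)
```

The efficiency gain comes from the shapes involved: every inference-time matrix product is over the n synthetic nodes rather than the N original ones, which is what enables the reported speedup and storage reduction.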


Related research

- 04/04/2021: Uniting Heterogeneity, Inductiveness, and Efficiency for Graph Representation Learning. "With the ubiquitous graph-structured data in various applications, model..."
- 03/17/2022: Graph Representation Learning with Individualization and Refinement. "Graph Neural Networks (GNNs) have emerged as prominent models for repres..."
- 11/01/2022: Efficient Graph Neural Network Inference at Large Scale. "Graph neural networks (GNNs) have demonstrated excellent performance in ..."
- 06/14/2023: NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification. "Graph neural networks have been extensively studied for learning with in..."
- 06/05/2023: Structure-free Graph Condensation: From Large-scale Graphs to Condensed Graph-free Data. "Graph condensation, which reduces the size of a large-scale graph by syn..."
- 10/09/2021: Towards Open-World Feature Extrapolation: An Inductive Graph Learning Approach. "We target open-world feature extrapolation problem where the feature spa..."
- 06/25/2021: Data efficiency in graph networks through equivariance. "We introduce a novel architecture for graph networks which is equivarian..."
