Uniting Heterogeneity, Inductiveness, and Efficiency for Graph Representation Learning

04/04/2021
by Tong Chen, et al.

With graph-structured data ubiquitous across applications, models that learn compact yet expressive vector representations of nodes have become highly desirable. Built on the message passing paradigm, graph neural networks (GNNs) have recently advanced the state of node representation learning on graphs. However, the majority of GNNs are designed only for homogeneous graphs, limiting their adaptability to the more informative heterogeneous graphs that carry multiple types of nodes and edges. Moreover, despite the need to inductively produce representations for completely new nodes (e.g., in streaming scenarios), few heterogeneous GNNs can bypass the transductive learning scheme in which all nodes must be known during training. Furthermore, the training efficiency of most heterogeneous GNNs is hindered by their sophisticated designs for extracting the semantics associated with each meta path or relation. In this paper, we propose the WIde and DEep message passing Network (WIDEN) to address heterogeneity, inductiveness, and efficiency, three problems rarely investigated together in graph representation learning. WIDEN introduces a novel inductive, meta path-free message passing scheme that fuses heterogeneous node features with their associated edges from both low- and high-order neighbor nodes. To further improve training efficiency, we present an active downsampling strategy that drops unimportant neighbor nodes to speed up information propagation. Experiments on three real-world heterogeneous graphs validate the efficacy of WIDEN on both transductive and inductive node representation learning, as well as its superior training efficiency over state-of-the-art baselines.
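As a rough illustration only (the abstract does not specify WIDEN's actual equations), the sketch below shows the two ideas named above in their simplest form: a meta path-free message passing step that fuses each neighbor's features with an embedding of the connecting edge's type, plus an importance-based downsampling step that keeps only the top-k messages before aggregation. The function names, the additive fusion, and the norm-based importance score are all illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy heterogeneous graph: 6 nodes, feature dim 4, two edge types.
num_nodes, dim = 6, 4
features = rng.normal(size=(num_nodes, dim))
edges = [(0, 1, 0), (2, 1, 0), (3, 1, 1), (4, 1, 1), (5, 1, 0)]  # (src, dst, type)
edge_type_emb = rng.normal(size=(2, dim))  # one learnable vector per edge type

def aggregate(node, edges, features, edge_type_emb, k=3):
    """One meta path-free message passing step for `node`: fuse each
    incoming neighbor's features with its edge-type embedding, keep only
    the top-k messages by norm (the downsampling step), then average."""
    messages = [features[src] + edge_type_emb[etype]
                for src, dst, etype in edges if dst == node]
    if not messages:
        return features[node]  # isolated node keeps its own features
    messages = np.stack(messages)
    scores = np.linalg.norm(messages, axis=1)  # illustrative importance proxy
    keep = np.argsort(scores)[-k:]             # drop low-importance neighbors
    return messages[keep].mean(axis=0)

h1 = aggregate(1, edges, features, edge_type_emb, k=3)
print(h1.shape)  # (4,)
```

Stacking several such steps would mix in higher-order neighbors; the paper's "wide and deep" design presumably combines low- and high-order information more carefully than this single averaged hop.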
