Hop-Aware Dimension Optimization for Graph Neural Networks

05/30/2021
by   Ailing Zeng, et al.

In Graph Neural Networks (GNNs), the embedding of each node is obtained by aggregating information from its direct and indirect neighbors. As the messages passed among nodes contain both information and noise, the critical issue in GNN representation learning is how to retrieve information effectively while suppressing noise. Generally speaking, interactions with distant nodes introduce more noise for a particular node than those with close nodes. However, in most existing works, the messages passed among nodes are mingled together, which is inefficient from a communication perspective. Mixing information from clean sources (low-order neighbors) and noisy sources (high-order neighbors) makes discriminative feature extraction challenging. Motivated by the above, we propose a simple yet effective ladder-style GNN architecture, namely LADDER-GNN. Specifically, we separate messages from different hops and assign different dimensions to them before concatenating them to obtain the node representation. Such disentangled representations facilitate extracting information from messages passed from different hops, and their corresponding dimensions are determined with a reinforcement learning-based neural architecture search strategy. The resulting hop-aware representations generally contain more dimensions for low-order neighbors and fewer dimensions for high-order neighbors, leading to a ladder-style aggregation scheme. We verify the proposed LADDER-GNN on several semi-supervised node classification datasets. Experimental results show that the proposed simple hop-aware representation learning solution achieves state-of-the-art performance on most datasets.
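The core idea of the abstract — aggregating each hop separately, projecting each hop's messages to a hop-specific (typically decreasing) dimension, and concatenating the results — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the random-projection weights, the ReLU nonlinearity, and the `hop_dims` values are illustrative assumptions (in the paper, the per-hop dimensions are found by a reinforcement learning-based architecture search).

```python
import numpy as np

def normalize_adj(adj):
    # Add self-loops and row-normalize, as in standard GNN aggregation.
    a = adj + np.eye(adj.shape[0])
    return a / a.sum(axis=1, keepdims=True)

def ladder_gnn_layer(features, adj, hop_dims, seed=0):
    """Ladder-style hop-aware aggregation sketch.

    Messages from hop k are projected to hop_dims[k] dimensions
    (hypothetical random weights stand in for learned ones), then
    all hop representations are concatenated instead of mixed.
    """
    rng = np.random.default_rng(seed)
    a_hat = normalize_adj(adj)
    h = features
    parts = []
    for dim in hop_dims:          # e.g. decreasing dims: the "ladder"
        h = a_hat @ h             # aggregate one more hop of neighbors
        w = rng.standard_normal((h.shape[1], dim)) * 0.1
        parts.append(np.maximum(h @ w, 0.0))  # ReLU-projected hop messages
    # Disentangled node representation: one block per hop.
    return np.concatenate(parts, axis=1)

# Toy 4-node path graph with one-hot features.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
x = np.eye(4)
z = ladder_gnn_layer(x, adj, hop_dims=[8, 4, 2])
print(z.shape)  # (4, 8 + 4 + 2) = (4, 14)
```

Because each hop occupies its own slice of the output vector, a downstream classifier can weight close-neighbor information (8 dimensions here) more heavily than distant, noisier information (2 dimensions), which is the communication-efficiency argument the abstract makes.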


