Tree Decomposed Graph Neural Network

08/25/2021
by Yu Wang, et al.

Graph Neural Networks (GNNs) have achieved significant success in learning better representations by iteratively performing feature propagation and transformation to leverage neighborhood information. Nevertheless, iterative propagation forces information from higher-layer neighborhoods to be transported through, and fused with, information from lower-layer neighborhoods, which unavoidably smooths features between neighborhoods at different layers and can thus compromise performance, especially on heterophily networks. Furthermore, most deep GNNs only recognize the importance of higher-layer neighborhoods but have yet to fully explore the importance of multi-hop dependency among neighborhoods at different layers in learning better representations. In this work, we first theoretically analyze the feature smoothing between neighborhoods at different layers and empirically demonstrate that the homophily level varies across neighborhoods at different layers. Motivated by these analyses, we further propose a tree decomposition method that disentangles neighborhoods at different layers to alleviate feature smoothing among them. Moreover, we characterize the multi-hop dependency via graph diffusion within our tree decomposition formulation to construct the Tree Decomposed Graph Neural Network (TDGNN), which can flexibly incorporate information from large receptive fields and aggregate this information using the multi-hop dependency. Comprehensive experiments demonstrate the superior performance of TDGNN on both homophily and heterophily networks under a variety of node classification settings. Extensive parameter analysis highlights the ability of TDGNN to prevent over-smoothing and to incorporate features from shallow layers with deeper multi-hop dependencies, which provides new insights towards deeper graph neural networks. Code of TDGNN: http://github.com/YuWVandy/TDGNN
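The abstract describes two ingredients: propagating each hop's neighborhood separately so that layers are not iteratively fused, and combining the per-hop representations with multi-hop dependency weights obtained via graph diffusion. The sketch below illustrates that general idea in PyTorch; the class name TDGNNSketch, the PPR-style coefficients theta_k = alpha * (1 - alpha)^k, and the dense normalized adjacency are illustrative assumptions rather than the authors' released implementation (see the linked repository for that).

```python
import torch
import torch.nn as nn

class TDGNNSketch(nn.Module):
    """Minimal sketch of decoupled multi-hop aggregation in the spirit of TDGNN.

    Each k-hop neighborhood representation is computed separately (no iterative
    fusion of hops), then the per-hop representations are combined with
    diffusion-style weights theta_k before a shared MLP classifier.
    """

    def __init__(self, in_dim, num_classes, num_hops=5, alpha=0.1, hidden=64):
        super().__init__()
        self.num_hops = num_hops
        # Hypothetical choice of multi-hop dependency weights:
        # personalized-PageRank-style coefficients theta_k = alpha * (1 - alpha)^k,
        # normalized to sum to one.
        theta = alpha * (1 - alpha) ** torch.arange(num_hops + 1, dtype=torch.float)
        self.register_buffer("theta", theta / theta.sum())
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(), nn.Linear(hidden, num_classes)
        )

    def forward(self, x, adj_norm):
        # x: (N, in_dim) node features
        # adj_norm: (N, N) normalized adjacency (dense here for clarity)
        h = x
        out = self.theta[0] * x
        for k in range(1, self.num_hops + 1):
            h = adj_norm @ h               # k-hop neighborhood features, kept separate per hop
            out = out + self.theta[k] * h  # weighted combination via multi-hop dependency
        return self.mlp(out)
```

In this sketch, adj_norm would typically be the symmetrically normalized adjacency with self-loops; the paper's tree decomposition disentangles layer-wise neighborhoods before aggregation, which this dense approximation only hints at.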

