Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with Heterophily

03/19/2022
by Jie Chen, et al.

Due to the homophily assumption of graph convolution networks, a common consensus is that graph neural networks (GNNs) perform well on homophilic graphs but may fail on heterophilic graphs with many inter-class edges. In this work, we re-examine the heterophily problem of GNNs and investigate the feature aggregation of inter-class neighbors. To better evaluate whether a node's neighbors are helpful for the downstream task, we introduce the concept of the neighbor effect of each node and use the von Neumann entropy to measure the randomness/identifiability of the neighbor distribution for each class. Moreover, we propose a Conv-Agnostic GNNs framework (CAGNNs) that enhances the performance of GNNs on heterophilic datasets by learning the neighbor effect of each node. Specifically, we first decouple the features of each node into a discriminative feature for the downstream task and an aggregation feature for graph convolution. Then, a mixer module shared across all layers adaptively evaluates the neighbor effect of each node and decides how to incorporate the neighbor information. Experiments are performed on nine well-known benchmark datasets for the node classification task. The results indicate that our framework improves the average prediction performance by 9.81%, 25.81%, and 20.61% for GIN, GAT, and GCN, respectively. Extensive ablation studies and robustness analysis further verify the effectiveness, robustness, and interpretability of our framework.
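As a rough illustration of the framework described above, the sketch below shows the core idea in a PyTorch-style layer: each node's representation is split into a discriminative branch kept for the classifier and an aggregation branch passed through graph convolution, and a mixer shared by all layers learns a per-node gate (the "neighbor effect") deciding how much aggregated neighbor information to mix back in. This is a minimal sketch under stated assumptions, not the authors' implementation; the class names (MixerModule, CAGNNLayer), the sigmoid gate, and the dense mean aggregation are illustrative choices.

```python
# Minimal, hypothetical sketch of the decouple + shared-mixer idea (not the
# authors' code). Any GNN aggregation could replace mean_agg, which is why
# the framework is described as "conv-agnostic".

import torch
import torch.nn as nn


class MixerModule(nn.Module):
    """Shared across layers: scores how useful a node's neighbors are."""

    def __init__(self, dim: int):
        super().__init__()
        self.gate = nn.Sequential(
            nn.Linear(2 * dim, dim), nn.ReLU(),
            nn.Linear(dim, 1), nn.Sigmoid(),
        )

    def forward(self, self_feat: torch.Tensor, neigh_feat: torch.Tensor) -> torch.Tensor:
        # One gate value per node in [0, 1]: the learned neighbor effect.
        return self.gate(torch.cat([self_feat, neigh_feat], dim=-1))


class CAGNNLayer(nn.Module):
    """One conv-agnostic layer built around an arbitrary aggregation."""

    def __init__(self, dim: int, mixer: MixerModule):
        super().__init__()
        self.proj_disc = nn.Linear(dim, dim)   # discriminative branch
        self.proj_aggr = nn.Linear(dim, dim)   # aggregation branch
        self.mixer = mixer                     # mixer shared by all layers

    @staticmethod
    def mean_agg(adj: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        # Simple mean aggregation over neighbors (dense adjacency for brevity).
        deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
        return adj @ h / deg

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        z = torch.relu(self.proj_disc(x))                       # kept for the task
        h = self.mean_agg(adj, torch.relu(self.proj_aggr(x)))   # neighbor message
        alpha = self.mixer(z, h)                                # per-node neighbor effect
        return z + alpha * h                                    # mix neighbors only when useful


if __name__ == "__main__":
    n, d = 5, 16
    x = torch.randn(n, d)
    adj = (torch.rand(n, n) > 0.5).float()
    layer = CAGNNLayer(d, MixerModule(d))
    print(layer(x, adj).shape)  # torch.Size([5, 16])
```

Sharing a single MixerModule across layers keeps the gate consistent at every depth, so a node whose inter-class neighbors are uninformative can suppress aggregation throughout the network rather than at a single layer.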

