Is Homophily a Necessity for Graph Neural Networks?

by Yao Ma, et al.

Graph neural networks (GNNs) have shown great prowess in learning representations suitable for numerous graph-based machine learning tasks. When applied to semi-supervised node classification, GNNs are widely believed to work well due to the homophily assumption ("like attracts like"), and to fail to generalize to heterophilous graphs, where dissimilar nodes connect. Recent works design new architectures to overcome such heterophily-related limitations, citing poor baseline performance and new architecture improvements on a few heterophilous graph benchmark datasets as evidence for this notion. In our experiments, we empirically find that standard graph convolutional networks (GCNs) can actually achieve better performance than such carefully designed methods on some commonly used heterophilous graphs. This motivates us to reconsider whether homophily is truly necessary for good GNN performance. We find that this claim is not quite true, and that in fact GCNs can achieve strong performance on heterophilous graphs under certain conditions. Our work carefully characterizes these conditions, and provides supporting theoretical understanding and empirical observations. Finally, we examine existing heterophilous graph benchmarks and reconcile how the GCN (under)performs on them based on this understanding.
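The homophily/heterophily distinction in the abstract is commonly quantified by an edge-homophily ratio: the fraction of edges whose endpoints share a class label (1.0 means fully homophilous, near 0 means heterophilous). A minimal sketch of that measure — the function name and toy data below are illustrative, not taken from the paper:

```python
# Illustrative sketch: the edge-homophily ratio, a common way to quantify
# how homophilous a labeled graph is. Not code from the paper.

def edge_homophily(edges, labels):
    """Fraction of edges whose two endpoints have the same label.

    edges: iterable of (u, v) node pairs
    labels: dict mapping node -> class label
    """
    edges = list(edges)
    if not edges:
        return 0.0
    same = sum(1 for u, v in edges if labels[u] == labels[v])
    return same / len(edges)

# Toy graph: 4 nodes in two classes, 2 of 4 edges link same-label nodes.
labels = {0: "a", 1: "a", 2: "b", 3: "b"}
edges = [(0, 1), (1, 2), (2, 3), (0, 3)]
print(edge_homophily(edges, labels))  # -> 0.5
```

A ratio of 0.5 here reflects a mixed graph; benchmarks labeled "heterophilous" in the literature typically score well below that.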

