Two Sides of the Same Coin: Heterophily and Oversmoothing in Graph Convolutional Neural Networks

02/12/2021
by Yujun Yan, et al.

Most graph neural networks (GNNs) perform poorly in graphs where neighbors typically have different features/classes (heterophily) and when stacking multiple layers (oversmoothing). These two seemingly unrelated problems have been studied independently, but there is recent empirical evidence that solving one problem may benefit the other. In this work, going beyond empirical observations, we theoretically characterize the connections between heterophily and oversmoothing, both of which lead to indistinguishable node representations. By modeling the change in node representations during message propagation, we theoretically analyze the factors (e.g., degree, heterophily level) that make the representations of nodes from different classes indistinguishable. Our analysis highlights that (1) nodes with high heterophily, as well as nodes with low heterophily but low degrees relative to their neighbors (degree discrepancy), trigger the oversmoothing problem, and (2) allowing "negative" messages between neighbors can decouple the heterophily and oversmoothing problems. Based on these insights, we design a model that addresses the discrepancy in features and degrees between neighbors by incorporating signed messages and learned degree corrections. Our experiments on 9 real networks show that our model achieves state-of-the-art performance under heterophily and performs comparably to existing GNNs under low heterophily (homophily). It also effectively addresses oversmoothing and even benefits from multiple layers.
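To make the two mechanisms mentioned in the abstract concrete, the sketch below shows, in PyTorch, one plausible way a message-passing layer could combine signed neighbor messages with a learned degree correction. This is a minimal illustration under stated assumptions, not the authors' actual architecture: the class name SignedDegreeCorrectedLayer, the edge-scoring linear layer, and the degree_scale parameter are all hypothetical.

import torch
import torch.nn as nn


class SignedDegreeCorrectedLayer(nn.Module):
    """Toy message-passing layer: signed neighbor messages + a learned degree correction.

    Illustrative sketch only; not the model proposed in the paper.
    """

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)
        # Scores each edge in [-1, 1]; negative scores let a node "push away"
        # messages from dissimilar (heterophilous) neighbors.
        self.edge_score = nn.Linear(2 * out_dim, 1)
        # Single learned scalar controlling how strongly degree discrepancy
        # rescales incoming messages (an assumption made for this sketch).
        self.degree_scale = nn.Parameter(torch.zeros(1))

    def forward(self, x, edge_index, deg):
        # x: [N, in_dim] node features
        # edge_index: [2, E] edges as (source, target) index pairs
        # deg: [N] node degrees (float tensor)
        h = self.lin(x)
        src, dst = edge_index
        # Signed weight per message, based on the endpoint representations.
        sign = torch.tanh(self.edge_score(torch.cat([h[src], h[dst]], dim=-1))).squeeze(-1)
        # Degree correction: rescale each message by the (bounded) ratio of
        # receiver degree to sender degree, modulated by the learned scalar.
        ratio = (deg[dst] / deg[src].clamp(min=1.0)).clamp(max=10.0)
        corr = torch.exp(self.degree_scale) * ratio
        msg = (sign * corr).unsqueeze(-1) * h[src]
        # Sum the signed, degree-corrected messages at each target node.
        out = torch.zeros_like(h)
        out.index_add_(0, dst, msg)
        # Residual connection keeps each node's own representation distinguishable.
        return h + out


# Tiny usage example on an undirected 3-node path graph (edges 0-1 and 1-2).
x = torch.randn(3, 4)
edge_index = torch.tensor([[0, 1, 1, 2], [1, 0, 2, 1]])
deg = torch.tensor([1.0, 2.0, 1.0])
layer = SignedDegreeCorrectedLayer(4, 8)
print(layer(x, edge_index, deg).shape)  # torch.Size([3, 8])

Because the edge scores can be negative and the degree ratio is learned, stacking several such layers does not force neighboring representations toward a common value, which is the intuition behind decoupling heterophily from oversmoothing.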

