Neural Sheaf Diffusion: A Topological Perspective on Heterophily and Oversmoothing in GNNs

02/09/2022
by   Cristian Bodnar, et al.

Cellular sheaves equip graphs with "geometrical" structure by assigning vector spaces and linear maps to nodes and edges. Graph Neural Networks (GNNs) implicitly assume a graph with a trivial underlying sheaf. This choice is reflected in the structure of the graph Laplacian operator, the properties of the associated diffusion equation, and the characteristics of the convolutional models that discretise this equation. In this paper, we use cellular sheaf theory to show that the underlying geometry of the graph is deeply linked with the performance of GNNs in heterophilic settings and their oversmoothing behaviour. By considering a hierarchy of increasingly general sheaves, we study how the choice of sheaf expands the ability of the sheaf diffusion process to linearly separate the classes in the infinite-time limit. At the same time, we prove that when the sheaf is non-trivial, discretised parametric diffusion processes have greater control than GNNs over their asymptotic behaviour. On the practical side, we study how sheaves can be learned from data. The resulting sheaf diffusion models have many desirable properties that address the limitations of classical graph diffusion equations (and the corresponding GNN models) and obtain state-of-the-art results in heterophilic settings. Overall, our work provides new connections between GNNs and algebraic topology and should be of interest to both fields.
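To make the objects in the abstract concrete, the following minimal NumPy sketch assembles a sheaf Laplacian from per-edge restriction maps and runs an explicit-Euler discretisation of sheaf diffusion. It is an illustration under assumptions, not the paper's implementation: the function name `sheaf_laplacian`, the random orthogonal restriction maps, and the step size are all hypothetical choices made for this example.

```python
import numpy as np

def sheaf_laplacian(edges, n_nodes, d, restriction_maps):
    """Assemble the dense sheaf Laplacian for d-dimensional node stalks.

    restriction_maps[(u, v, w)] is the d x d map restricting the stalk at
    node w (an endpoint of edge (u, v)) onto the stalk attached to that edge.
    """
    L = np.zeros((n_nodes * d, n_nodes * d))
    for (u, v) in edges:
        F_u = restriction_maps[(u, v, u)]
        F_v = restriction_maps[(u, v, v)]
        # Diagonal blocks accumulate F^T F; off-diagonal blocks are -F_u^T F_v.
        L[u*d:(u+1)*d, u*d:(u+1)*d] += F_u.T @ F_u
        L[v*d:(v+1)*d, v*d:(v+1)*d] += F_v.T @ F_v
        L[u*d:(u+1)*d, v*d:(v+1)*d] -= F_u.T @ F_v
        L[v*d:(v+1)*d, u*d:(u+1)*d] -= F_v.T @ F_u
    return L

# Toy example (hypothetical): a triangle graph with 2-dimensional stalks and
# random orthogonal restriction maps, in the spirit of an O(d)-bundle sheaf.
rng = np.random.default_rng(0)
edges = [(0, 1), (1, 2), (0, 2)]
n, d = 3, 2
maps = {}
for (u, v) in edges:
    for w in (u, v):
        q, _ = np.linalg.qr(rng.standard_normal((d, d)))
        maps[(u, v, w)] = q
L = sheaf_laplacian(edges, n, d, maps)

# Explicit-Euler discretisation of sheaf diffusion dX/dt = -L X.
X = rng.standard_normal((n * d, 4))  # 4 feature channels per stalk dimension
tau = 0.05                           # step size, small enough for stability here
for _ in range(100):
    X = X - tau * (L @ X)
```

With stalk dimension d = 1 and every restriction map set to the identity, L reduces to the standard graph Laplacian, i.e. the trivial sheaf that the abstract says GNNs implicitly assume; non-trivial restriction maps are what give the diffusion process its extra control in heterophilic settings.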

Related research

06/21/2021 · GRAND: Graph Neural Diffusion
We present Graph Neural Diffusion (GRAND) that approaches deep learning ...

09/04/2020 · Rethinking Graph Regularization For Graph Neural Networks
The graph Laplacian regularization term is usually used in semi-supervis...

05/10/2021 · Optimization of Graph Neural Networks: Implicit Acceleration by Skip Connections and More Depth
Graph Neural Networks (GNNs) have been studied through the lens of expre...

06/22/2021 · Continuous-Depth Neural Models for Dynamic Graph Prediction
We introduce the framework of continuous-depth graph neural networks (GN...

04/10/2022 · Expressiveness and Approximation Properties of Graph Neural Networks
Characterizing the separation power of graph neural networks (GNNs) prov...

04/30/2022 · Graph Anisotropic Diffusion
Traditional Graph Neural Networks (GNNs) rely on message passing, which ...

05/05/2020 · Deep Lagrangian Constraint-based Propagation in Graph Neural Networks
Several real-world applications are characterized by data that exhibit a...