Understanding the Message Passing in Graph Neural Networks via Power Iteration

05/30/2020
by Xue Li, et al.

The mechanism of message passing in graph neural networks (GNNs) remains poorly understood in the literature. To our knowledge, no one has proposed a theoretical origin for GNNs other than convolutional neural networks. Somewhat to our surprise, message passing can be best understood in terms of power iteration. By removing the activation functions and layer weights of GNNs, we propose subspace power iteration clustering (SPIC) models that are naturally interpretable and scalable. Our experiments show that these models extend existing GNNs and enhance their capability to process randomly featured networks. Moreover, we demonstrate the redundancy of some state-of-the-art GNN designs and define a lower limit for model evaluation by randomly initializing the aggregator of message passing. All the findings in this paper push the boundaries of our understanding of neural networks.


Related research

02/22/2022 · Message passing all the way up
02/14/2020 · Generalization and Representational Limits of Graph Neural Networks
07/20/2022 · ReFactorGNNs: Revisiting Factorisation-based Models from a Message-Passing Perspective
07/05/2021 · Elastic Graph Neural Networks
11/19/2022 · Tired of Over-smoothing? Stress Graph Drawing Is All You Need!
06/07/2022 · Utility of Equivariant Message Passing in Cortical Mesh Segmentation
04/06/2020 · Let's Agree to Degree: Comparing Graph Convolutional Networks in the Message-Passing Framework
