An Anatomy of Graph Neural Networks Going Deep via the Lens of Mutual Information: Exponential Decay vs. Full Preservation

10/10/2019
by Nezihe Merve Gürel, et al.

Graph Convolutional Networks (GCNs) have attracted intense interest recently. One major limitation of GCN is that it often cannot benefit from a deep architecture, whereas traditional CNNs and an alternative graph neural network architecture, GraphCNN, often achieve better quality with deeper architectures. How can we explain this phenomenon? In this paper, we take a first step towards answering this question. We first conduct a systematic empirical study of the accuracy of GCN, GraphCNN, and ResNet-18 on 2D images and identify the relative importance of different factors in the architectural design. This inspires a novel theoretical analysis of the mutual information between the input and the output after l GCN and GraphCNN layers. We identify regimes in which GCN suffers exponentially fast information loss, and show that GraphCNN requires a much weaker condition for similar behavior to happen.
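The information-loss phenomenon the abstract describes can be made concrete with a toy experiment. The sketch below is not the paper's analysis; it is a minimal NumPy illustration of the underlying mechanism, assuming the standard GCN propagation rule X ← D^(-1/2)(A+I)D^(-1/2) X on a small ring graph: repeated aggregation drives all node features toward a common value, so after many layers the output retains little information about the input.

```python
import numpy as np

# Toy illustration (not the paper's exact analysis): repeated GCN-style
# propagation on a small graph shrinks the spread of node features,
# collapsing them toward a shared value ("oversmoothing").
rng = np.random.default_rng(0)

# Adjacency matrix of a 6-node ring graph.
n = 6
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0

A_tilde = A + np.eye(n)                     # add self-loops
d = A_tilde.sum(axis=1)                     # node degrees
A_hat = A_tilde / np.sqrt(np.outer(d, d))   # D^-1/2 (A+I) D^-1/2

X = rng.standard_normal((n, 4))             # random node features
spread_before = X.std(axis=0).mean()        # feature spread across nodes
for _ in range(20):                         # 20 propagation steps
    X = A_hat @ X
spread_after = X.std(axis=0).mean()

print(spread_before, spread_after)
```

On this graph the second-largest eigenvalue magnitude of the propagation matrix is below 1, so the across-node spread decays geometrically with depth; only the component along the dominant eigenvector survives.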

Related research

12/01/2022 · Architectural Implications of Embedding Dimension during GCN on CPU and GPU
Graph Neural Networks (GNNs) are a class of neural networks designed to ...

03/12/2018 · Probabilistic and Regularized Graph Convolutional Networks
This paper explores the recently proposed Graph Convolutional Network ar...

04/26/2021 · Graph Neural Networks with Adaptive Frequency Response Filter
Graph Neural Networks have recently become a prevailing paradigm for var...

10/15/2020 · Bi-GCN: Binary Graph Convolutional Network
Graph Neural Networks (GNNs) have achieved tremendous success in graph r...

03/20/2021 · Recognizing Predictive Substructures with Subgraph Information Bottleneck
The emergence of Graph Convolutional Network (GCN) has greatly boosted t...

07/11/2019 · Understanding the Representation Power of Graph Neural Networks in Learning Graph Topology
To deepen our understanding of graph neural networks, we investigate the...

03/03/2021 · Wide Graph Neural Networks: Aggregation Provably Leads to Exponentially Trainability Loss
Graph convolutional networks (GCNs) and their variants have achieved gre...
