Edge Entropy as an Indicator of the Effectiveness of GNNs over CNNs for Node Classification

12/16/2020
by Lavender Yao Jiang, et al.

Graph neural networks (GNNs) extend convolutional neural networks (CNNs) to graph-based data. A natural question is how much performance improvement the underlying graph structure in a GNN provides over a CNN that ignores this structure. To address this question, we introduce edge entropy and evaluate how well it indicates the possible performance improvement of GNNs over CNNs. Our results on node classification with synthetic and real datasets show that lower values of edge entropy predict larger expected performance gains of GNNs over CNNs, while higher edge entropy predicts smaller expected gains.
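The abstract does not spell out how edge entropy is computed. As a rough illustration only, the sketch below assumes a per-class definition: for each class, the Shannon entropy of the class distribution over the endpoints of edges incident to nodes of that class. The function name `per_class_edge_entropy` and this definition are assumptions for illustration, not the paper's formula; see the full text for the authors' definition.

```python
# A minimal sketch of an edge-entropy-style statistic, NOT the paper's exact
# definition. Assumption: for each class c, edge entropy is the Shannon entropy
# of the class distribution of neighbors reached by edges incident to class-c
# nodes. Lower values mean edges from class c concentrate on few classes, so
# the graph structure carries more usable information for classification.
import numpy as np

def per_class_edge_entropy(edges, labels, num_classes):
    """edges: iterable of (u, v) node-index pairs; labels: array of class ids."""
    labels = np.asarray(labels)
    # counts[c, k] = number of edge endpoints of class k adjacent to class c
    counts = np.zeros((num_classes, num_classes), dtype=float)
    for u, v in edges:
        counts[labels[u], labels[v]] += 1.0
        counts[labels[v], labels[u]] += 1.0  # treat the graph as undirected
    entropies = np.zeros(num_classes)
    for c in range(num_classes):
        total = counts[c].sum()
        if total == 0:
            continue  # isolated class: leave entropy at 0
        p = counts[c] / total
        p = p[p > 0]
        entropies[c] = -(p * np.log2(p)).sum()
    return entropies

# Example: two classes whose nodes mostly link within their own class,
# giving low per-class edge entropy (graph structure is informative).
edges = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (3, 5), (4, 5)]
labels = [0, 0, 0, 1, 1, 1]
print(per_class_edge_entropy(edges, labels, num_classes=2))
```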
