Graph Convolutional Neural Networks with Node Transition Probability-based Message Passing and DropNode Regularization

08/28/2020
by Tien Huu Do, et al.

Graph convolutional neural networks (GCNNs) have received much attention recently, owing to their ability to handle graph-structured data. Many existing GCNNs can be viewed as instances of a neural message-passing scheme: node features are passed to neighbors, aggregated, and transformed to produce better node representations. Nevertheless, these methods seldom use node transition probabilities, a measure that has proven useful for exploring graphs. Furthermore, when transition probabilities are used, the transition direction is often handled improperly in the feature aggregation step, resulting in an inefficient weighting scheme. In addition, although increasingly complex GCNN models have been introduced, GCNNs often suffer from over-fitting when trained on small graphs. Another issue is over-smoothing, which tends to make node representations indistinguishable. This work presents a new method that improves the message-passing process by basing it on node transition probabilities and properly accounting for the transition direction, yielding a better weighting scheme for feature aggregation than the existing counterpart. Moreover, we propose a novel regularization method, termed DropNode, to address over-fitting and over-smoothing simultaneously. DropNode randomly discards part of the graph, creating multiple deformed versions of it and thereby producing a data-augmentation regularization effect. Additionally, DropNode reduces the graph's connectivity, mitigating over-smoothing in deep GCNNs. Extensive experiments on eight benchmark datasets for node and graph classification demonstrate the effectiveness of the proposed methods in comparison with the state of the art.
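The abstract does not give the exact update rule, so the following is a minimal NumPy sketch of one plausible reading of transition-probability-based aggregation: with P = D^{-1}A the random-walk transition matrix, node i aggregates the feature of each neighbor j weighted by the probability of transitioning from j to i (i.e., propagating with P.T rather than P). The function names, the toy graph, and the ReLU nonlinearity are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def transition_matrix(adj):
    """Row-stochastic transition matrix P of a random walk on the graph:
    P[i, j] = adj[i, j] / deg(i) is the probability of stepping i -> j."""
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0  # guard against isolated nodes
    return adj / deg

def propagate(adj, features, weight):
    """One message-passing layer. When updating node i, the feature of a
    neighbor j is weighted by the probability of transitioning j -> i,
    i.e. we aggregate with P.T rather than P (our reading of the
    'transition direction' argument in the abstract)."""
    P = transition_matrix(adj)
    h = P.T @ features @ weight
    return np.maximum(h, 0.0)  # ReLU nonlinearity

# Toy usage on a 4-node undirected graph.
adj = np.array([[0, 1, 1, 0],
                [1, 0, 1, 0],
                [1, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
X = np.random.randn(4, 8)          # node features
W = np.random.randn(8, 16) * 0.1   # layer weights
print(propagate(adj, X, W).shape)  # (4, 16)
```

Under this reading, a low-degree neighbor contributes a larger weight to the update of node i than a hub does, since p(j to i) = 1/deg(j); the symmetric GCN normalization does not distinguish the two directions.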
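Similarly, DropNode is described here only at a high level (randomly discarding part of the graph). The sketch below realizes that idea under our own assumptions: a Bernoulli mask zeroes the features of dropped nodes and detaches their incident edges; masking-with-zeros rather than physically removing rows and columns, and the absence of rescaling, are our choices, not necessarily the paper's recipe.

```python
import numpy as np

def drop_node(adj, features, drop_rate, rng):
    """Return a deformed copy of the graph in which a random fraction of
    nodes is dropped: their features are zeroed and every edge touching
    them is removed. A fresh mask is sampled on each call."""
    n = adj.shape[0]
    keep = (rng.random(n) >= drop_rate).astype(float)  # 1.0 = node survives
    adj_dropped = adj * keep[:, None] * keep[None, :]  # cut incident edges
    feat_dropped = features * keep[:, None]            # silence dropped nodes
    return adj_dropped, feat_dropped

# Toy usage: each training step sees a differently deformed graph.
rng = np.random.default_rng(0)
adj = np.ones((5, 5)) - np.eye(5)  # complete graph on 5 nodes
X = np.random.randn(5, 8)
adj_d, X_d = drop_node(adj, X, drop_rate=0.2, rng=rng)
```

Because a new mask is sampled at every call, repeated invocations during training yield the multiple deformed graph versions the abstract credits for the data-augmentation effect, while the removed edges lower connectivity and slow feature mixing, which is the stated over-smoothing mitigation.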


Related research

02/01/2022
Memory-based Message Passing: Decoupling the Message for Propagation from Discrimination
Message passing is a fundamental procedure for graph neural networks in ...

10/25/2019
Improving Graph Attention Networks with Large Margin-based Constraints
Graph Attention Networks (GATs) are the state-of-the-art neural architec...

07/25/2019
DropEdge: Towards the Very Deep Graph Convolutional Networks for Node Classification
Existing Graph Convolutional Networks (GCNs) are shallow---the number of...

12/07/2020
NCGNN: Node-level Capsule Graph Neural Network
Message passing has evolved as an effective tool for designing Graph Neu...

12/05/2022
Graph Convolutional Neural Networks with Diverse Negative Samples via Decomposed Determinant Point Processes
Graph convolutional networks (GCNs) have achieved great success in graph...

05/14/2023
Addressing Heterophily in Node Classification with Graph Echo State Networks
Node classification tasks on graphs are addressed via fully-trained deep...
