Multi-scale Graph Convolutional Networks with Self-Attention

12/04/2021
by Zhilong Xiong, et al.

Graph convolutional networks (GCNs) have recently shown remarkable learning ability on various kinds of graph-structured data. In general, deep GCNs do not work well, since graph convolution in conventional GCNs is a special form of Laplacian smoothing, which makes the representations of different nodes indistinguishable. In the literature, multi-scale information has been employed to enhance the expressive power of GCNs. However, the over-smoothing phenomenon, a crucial issue for GCNs, remains to be investigated and solved. In this paper, we propose two novel multi-scale GCN frameworks that incorporate a self-attention mechanism and multi-scale information into the design of GCNs. Our methods greatly improve both the computational efficiency and the prediction accuracy of GCN models. Extensive experiments on both node classification and graph classification demonstrate their effectiveness over several state-of-the-art GCNs. Notably, the two proposed architectures can efficiently mitigate the over-smoothing problem of GCNs, and the depth of our models can even be increased to 64 layers.
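The paper's full method is not reproduced on this page, but the core idea in the abstract — propagating node features at several scales and fusing the scales with self-attention instead of simply stacking layers — can be sketched roughly as below. The function names, the choice of symmetric normalization, and the scalar per-scale attention scores are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def normalized_adjacency(A):
    # Symmetrically normalized adjacency with self-loops:
    # D^{-1/2} (A + I) D^{-1/2}, the standard GCN propagation operator.
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def multi_scale_gcn_layer(A, X, W, scales=(1, 2, 3)):
    """Hypothetical sketch of a multi-scale GCN layer with self-attention.

    Features are propagated k hops for each scale k, and the resulting
    per-scale representations are fused with a softmax attention over
    scales (here a simple mean-based score stands in for a learned one).
    """
    S = normalized_adjacency(A)
    # One propagated representation per scale: S^k X W
    multi = [np.linalg.matrix_power(S, k) @ X @ W for k in scales]
    # Scalar attention score per scale (illustrative proxy for a
    # learned attention function), normalized with softmax.
    scores = np.array([m.mean() for m in multi])
    weights = np.exp(scores - scores.max())
    weights = weights / weights.sum()
    # Attention-weighted combination of the scales.
    return sum(w * m for w, m in zip(weights, multi))
```

Because distant scales are down-weighted rather than repeatedly smoothed through stacked layers, a fusion of this kind is one plausible way such architectures avoid collapsing all node representations together as depth grows.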


Related research

11/16/2021 · SStaGCN: Simplified stacking based graph convolutional networks
Graph convolutional network (GCN) is a powerful model studied broadly in...

06/05/2019 · Break the Ceiling: Stronger Multi-scale Deep Graph Convolutional Networks
Recently, neural network based approaches have achieved significant impr...

01/06/2019 · LanczosNet: Multi-Scale Deep Graph Convolutional Networks
We propose the Lanczos network (LanczosNet), which uses the Lanczos algo...

10/21/2021 · FDGATII: Fast Dynamic Graph Attention with Initial Residual and Identity Mapping
While Graph Neural Networks have gained popularity in multiple domains, ...

12/28/2020 · Lip-reading with Hierarchical Pyramidal Convolution and Self-Attention
In this paper, we propose a novel deep learning architecture to improvin...

02/17/2022 · Revisiting Over-smoothing in BERT from the Perspective of Graph
Recently over-smoothing phenomenon of Transformer-based models is observ...

12/22/2021 · SkipNode: On Alleviating Over-smoothing for Deep Graph Convolutional Networks
Over-smoothing is a challenging problem, which degrades the performance ...
