Multi-Level Attention Pooling for Graph Neural Networks: Unifying Graph Representations with Multiple Localities

03/02/2021
by Takeshi D. Itoh, et al.

Graph neural networks (GNNs) have been widely used to learn vector representations of graph-structured data and have achieved better task performance than conventional methods. The foundation of GNNs is the message passing procedure, which propagates the information in a node to its neighbors. Since this procedure proceeds one step per layer, the scope of the information propagation among nodes is small in the early layers, and it expands toward the later layers. The problem here is that model performance degrades as the number of layers increases. A potential cause is that deep GNN models tend to lose the nodes' local information, which would be essential for good model performance, through many message passing steps. To solve this so-called oversmoothing problem, we propose a multi-level attention pooling (MLAP) architecture. It has an attention pooling layer for each message passing step and computes the final graph representation by unifying the layer-wise graph representations. The MLAP architecture allows models to utilize the structural information of graphs at multiple levels of locality because it preserves layer-wise information before it is lost to oversmoothing. Results of our experiments show that the MLAP architecture improves deeper models' performance in graph classification tasks compared to the baseline architectures. In addition, analyses of the layer-wise graph representations suggest that MLAP has the potential to learn graph representations with improved class discriminability by aggregating information with multiple levels of locality.
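The core idea described above — pool a graph-level vector from each message passing layer's node embeddings via attention, then unify the layer-wise vectors — can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the per-layer gate vectors and the choice of summation as the unification step are simplifying assumptions (the paper also discusses other aggregation variants, e.g. weighted combinations).

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(h, w):
    """Attention pooling: weight each node embedding by a softmax-normalized score.

    h: (n, d) node embeddings at one message passing layer
    w: (d,) gate vector producing one attention score per node (hypothetical
       stand-in for a learned gating network)
    """
    alpha = softmax(h @ w)   # (n,) attention weight per node
    return alpha @ h         # (d,) attention-weighted graph-level vector

def mlap_readout(layer_embeddings, gates):
    """MLAP-style readout: pool every layer, then sum the layer-wise vectors.

    Each layer contributes a graph representation with its own locality level
    (early layers: local structure; later layers: wider neighborhoods).
    """
    return sum(attention_pool(h, w) for h, w in zip(layer_embeddings, gates))

# Toy example: 5 nodes, 4-dimensional embeddings, 3 message passing layers.
rng = np.random.default_rng(0)
n, d, L = 5, 4, 3
layer_embeddings = [rng.normal(size=(n, d)) for _ in range(L)]
gates = [rng.normal(size=d) for _ in range(L)]

g = mlap_readout(layer_embeddings, gates)
print(g.shape)  # one fused graph representation: (4,)
```

Because each layer is pooled before further message passing can smooth it away, the final representation retains information at every locality level rather than only the deepest one.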

