Graph Joint Attention Networks

02/05/2021
by Tiantian He, et al.

Graph attention networks (GATs) have been recognized as powerful tools for learning on graph-structured data. However, enabling the attention mechanism in GATs to smoothly incorporate both structural and feature information remains challenging. In this paper, we propose Graph Joint Attention Networks (JATs) to address this challenge. Unlike previous attention-based graph neural networks (GNNs), JATs adopt novel joint attention mechanisms that automatically determine the relative significance of node features and structural coefficients learned from the graph topology when computing attention scores. JATs can therefore infer representations that capture more structural properties. In addition, we theoretically analyze the expressive power of JATs and propose an improved strategy for the joint attention mechanism that enables JATs to reach the upper bound of expressive power attainable by any message-passing GNN, i.e., the 1-WL test. JATs can thereby be regarded as among the most powerful message-passing GNNs. The proposed neural architecture has been extensively evaluated on widely used benchmark datasets and compared with state-of-the-art GNNs on various downstream predictive tasks. Experimental results show that JATs achieve state-of-the-art performance on all test datasets.
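The abstract does not specify the exact form of the joint attention score, so the following is only a minimal sketch of the general idea: a GAT-style, feature-based attention score is blended with a precomputed structural coefficient matrix through a learnable gate before the softmax. The class name JointAttentionLayer, the gate parameter lam, and the struct_coef input are hypothetical illustrations, not the paper's actual formulation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class JointAttentionLayer(nn.Module):
    """Sketch of one attention head that mixes feature-based attention
    (as in GAT) with a precomputed structural coefficient matrix."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = nn.Linear(in_dim, out_dim, bias=False)   # feature projection
        self.a = nn.Linear(2 * out_dim, 1, bias=False)     # GAT-style scoring
        # hypothetical learnable scalar balancing features vs. structure
        self.lam = nn.Parameter(torch.tensor(0.5))

    def forward(self, x, adj, struct_coef):
        # x: [N, in_dim] node features
        # adj: [N, N] adjacency with self-loops (1 = edge, 0 = no edge)
        # struct_coef: [N, N] structural coefficients derived from topology
        h = self.W(x)                                       # [N, out_dim]
        N = h.size(0)
        pair = torch.cat([h.unsqueeze(1).expand(N, N, -1),
                          h.unsqueeze(0).expand(N, N, -1)], dim=-1)
        e_feat = F.leaky_relu(self.a(pair).squeeze(-1))     # feature-based scores
        gate = torch.sigmoid(self.lam)
        e = gate * e_feat + (1.0 - gate) * struct_coef      # joint attention score
        e = e.masked_fill(adj == 0, float('-inf'))          # restrict to neighbors
        alpha = torch.softmax(e, dim=-1)                    # attention weights
        return alpha @ h                                    # aggregated messages

# toy usage on a 4-node graph
x = torch.randn(4, 8)
adj = torch.eye(4) + torch.tensor([[0,1,0,0],[1,0,1,0],[0,1,0,1],[0,0,1,0]], dtype=torch.float)
struct_coef = adj / adj.sum(dim=-1, keepdim=True)           # e.g., normalized adjacency
out = JointAttentionLayer(8, 16)(x, adj, struct_coef)

In this sketch the gate is a single scalar; the paper's mechanism determines the relative significance of the two sources automatically, which could equally be realized per edge or per head.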
