How to Find Your Friendly Neighborhood: Graph Attention Design with Self-Supervision

04/11/2022
by Dongkwan Kim, et al.

The attention mechanism in graph neural networks is designed to assign larger weights to important neighbor nodes so as to compute better node representations. However, what graph attention actually learns is not well understood, particularly when graphs are noisy. In this paper, we propose a self-supervised graph attention network (SuperGAT), an improved graph attention model for noisy graphs. Specifically, we exploit two attention forms compatible with a self-supervised task of predicting edges, whose presence and absence carry inherent information about the importance of the relationships between nodes. By encoding edges, SuperGAT learns more expressive attention that can distinguish mislinked neighbors. We find that two graph characteristics influence the effectiveness of attention forms and self-supervision: homophily and average degree. Our recipe therefore provides guidance on which attention design to use when these two characteristics are known. Experiments on 17 real-world datasets show that the recipe generalizes to 15 of them, and that models designed by the recipe outperform baselines.
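As a rough sketch (not the authors' released implementation), the snippet below illustrates the core idea with the dot-product (DP) attention form described in the abstract: the same logits that weight neighbor aggregation are also trained to predict which edges exist, using randomly sampled node pairs as negatives. The layer and function names, the negative-sampling scheme, and the toy dimensions are illustrative assumptions of this sketch.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def segment_softmax(logits, index, num_nodes):
    """Softmax of per-edge logits, grouped by destination node index."""
    logits = logits - logits.max()  # global shift for numerical stability
    exp = logits.exp()
    denom = torch.zeros(num_nodes, device=exp.device).index_add_(0, index, exp)
    return exp / (denom[index] + 1e-16)


class DPAttentionLayer(nn.Module):
    """One graph-attention layer whose dot-product (DP) logits also serve as
    edge scores for a self-supervised link-prediction objective (sketch)."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, x, edge_index):
        # x: [N, in_dim] node features; edge_index: [2, E] (source, target) pairs.
        h = self.proj(x)                                  # [N, out_dim]
        src, dst = edge_index
        logits = (h[src] * h[dst]).sum(dim=-1)            # DP attention logits, [E]
        alpha = segment_softmax(logits, dst, x.size(0))   # normalize per target node
        out = torch.zeros_like(h).index_add_(0, dst, alpha.unsqueeze(-1) * h[src])
        return out, logits


def edge_prediction_loss(layer, x, pos_edges, num_nodes):
    """Self-supervised term: attention logits should score observed edges
    higher than randomly sampled absent node pairs."""
    _, pos_logits = layer(x, pos_edges)
    neg_edges = torch.randint(0, num_nodes, pos_edges.shape)  # naive negative sampling
    h = layer.proj(x)
    neg_logits = (h[neg_edges[0]] * h[neg_edges[1]]).sum(dim=-1)
    scores = torch.cat([pos_logits, neg_logits])
    labels = torch.cat([torch.ones_like(pos_logits), torch.zeros_like(neg_logits)])
    return F.binary_cross_entropy_with_logits(scores, labels)


# Toy usage: 4 nodes with 8-dim features and 4 directed edges. In training,
# this auxiliary loss would be added (with a weight) to the node-classification loss.
x = torch.randn(4, 8)
edges = torch.tensor([[0, 1, 2, 3], [1, 0, 3, 2]])
layer = DPAttentionLayer(8, 16)
node_repr, _ = layer(x, edges)
aux_loss = edge_prediction_loss(layer, x, edges, num_nodes=4)
```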


Related research

12/16/2021  Self-Supervised Dynamic Graph Representation Learning via Temporal Subgraph Contrast
Self-supervised learning on graphs has recently drawn a lot of attention...

01/10/2022  Cross-view Self-Supervised Learning on Heterogeneous Graph Neural Network via Bootstrapping
Heterogeneous graph neural networks can represent information of heterog...

09/03/2020  CAGNN: Cluster-Aware Graph Neural Networks for Unsupervised Graph Representation Learning
Unsupervised graph representation learning aims to learn low-dimensional...

12/10/2020  Image-Graph-Image Translation via Auto-Encoding
This work presents the first convolutional neural network that learns an...

08/09/2023  Speaker Recognition Using Isomorphic Graph Attention Network Based Pooling on Self-Supervised Representation
The emergence of self-supervised representation (i.e., wav2vec 2.0) allo...

04/08/2020  Improving Expressivity of Graph Neural Networks
We propose a Graph Neural Network with greater expressive power than com...

03/06/2023  DEDGAT: Dual Embedding of Directed Graph Attention Networks for Detecting Financial Risk
Graph representation plays an important role in the field of financial r...
