Attention-based Graph Neural Network for Semi-supervised Learning

03/10/2018
by Kiran K. Thekumparampil, et al.

Recently popularized graph neural networks achieve state-of-the-art accuracy on a number of standard benchmark datasets for graph-based semi-supervised learning, improving significantly over existing approaches. These architectures alternate between a propagation layer that aggregates the hidden states of the local neighborhood and a fully-connected layer. Perhaps surprisingly, we show that a linear model that removes all the intermediate fully-connected layers is still able to achieve performance comparable to the state-of-the-art models. This significantly reduces the number of parameters, which is critical for semi-supervised learning, where the number of labeled examples is small. This in turn leaves room for designing more innovative propagation layers. Based on this insight, we propose a novel graph neural network that removes all the intermediate fully-connected layers and replaces the propagation layers with attention mechanisms that respect the structure of the graph. The attention mechanism allows us to learn a dynamic and adaptive local summary of the neighborhood to achieve more accurate predictions. In a number of experiments on benchmark citation network datasets, we demonstrate that our approach outperforms competing methods. By examining the attention weights among neighbors, we show that our model provides some interesting insights on how neighbors influence each other.
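The propagation layer sketched in the abstract replaces learned weight matrices with attention over each node's neighborhood. Below is a minimal numpy sketch of one plausible instantiation, assuming cosine-similarity attention with a scalar temperature `beta` (the function name, dense adjacency matrix, and fixed `beta` are illustrative choices, not the authors' code, where `beta` would be a learned parameter):

```python
import numpy as np

def agnn_propagation(H, adj, beta=1.0):
    """One attention-based propagation layer over a graph.

    H:    (n, d) array of node hidden states, one row per node.
    adj:  (n, n) binary adjacency matrix, assumed to include self-loops.
    beta: attention temperature (a learned scalar in the trained model).
    """
    # Cosine similarity between every pair of hidden states.
    unit = H / (np.linalg.norm(H, axis=1, keepdims=True) + 1e-10)
    cos = unit @ unit.T
    # Restrict attention to graph neighbors, then softmax per node.
    scores = np.where(adj > 0, beta * cos, -np.inf)
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    P = np.exp(scores)
    P /= P.sum(axis=1, keepdims=True)
    # Each node's new state is an attention-weighted average of its
    # neighborhood's states -- no fully-connected layer in between.
    return P @ H
```

Because every row of the attention matrix is a softmax over the node's neighborhood, each output state is a convex combination of neighboring states; inspecting those rows is what lets the authors read off how neighbors influence each other.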


Related research

09/25/2019 — GraphMix: Regularized Training of Graph Neural Networks for Semi-Supervised Learning
We present GraphMix, a regularization technique for Graph Neural Network...

03/16/2021 — Hebbian Semi-Supervised Learning in a Sample Efficiency Setting
We propose to address the issue of sample efficiency, in Deep Convolutio...

01/28/2021 — Improving Neural Network Robustness through Neighborhood Preserving Layers
Robustness against adversarial attack in neural networks is an important...

11/01/2018 — Improving Robustness of Attention Models on Graphs
Machine learning models that can exploit the inherent structure in data ...

06/07/2023 — Permutation Equivariant Graph Framelets for Heterophilous Semi-supervised Learning
The nature of heterophilous graphs is significantly different with that ...

10/11/2022 — The Unreasonable Effectiveness of Fully-Connected Layers for Low-Data Regimes
Convolutional neural networks were the standard for solving many compute...

10/22/2019 — Recurrent Attention Walk for Semi-supervised Classification
In this paper, we study the graph-based semi-supervised learning for cla...
