Improving Graph Attention Networks with Large Margin-based Constraints

10/25/2019
by Guangtao Wang, et al.

Graph Attention Networks (GATs) are a state-of-the-art neural architecture for representation learning on graphs. GATs learn attention functions that assign weights to nodes, so that different nodes exert different influences in the feature aggregation steps. In practice, however, the induced attention functions are prone to over-fitting due to the growing number of parameters and the lack of direct supervision on attention weights. GATs also suffer from over-smoothing at the decision boundaries between classes of nodes. Here we propose a framework that addresses both weaknesses via margin-based constraints on attention during training. We first theoretically demonstrate the over-smoothing behavior of GATs and then develop an approach that constrains the attention weights according to the class boundary and the feature aggregation pattern. Furthermore, to alleviate the over-fitting problem, we propose additional constraints on the graph structure. Extensive experiments and ablation studies on common benchmark datasets demonstrate the effectiveness of our method, which yields significant improvements over previous state-of-the-art graph attention methods on all datasets.
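The attention-weighted aggregation the abstract refers to can be sketched as follows. This is a minimal NumPy illustration of a single standard GAT-style attention layer (per Veličković et al.), not the paper's constrained variant; the function name, shapes, and LeakyReLU slope are illustrative assumptions:

```python
import numpy as np

def gat_attention_layer(H, A, W, a, leaky_slope=0.2):
    """One GAT-style attention layer (illustrative sketch).

    H: (N, F) node features; A: (N, N) adjacency with self-loops;
    W: (F, Fp) projection matrix; a: (2*Fp,) attention vector.
    Returns aggregated features (N, Fp) and attention weights (N, N).
    """
    Z = H @ W                     # project node features
    N = Z.shape[0]
    # Raw score e[i, j] = LeakyReLU(a^T [z_i || z_j]) for each node pair
    e = np.empty((N, N))
    for i in range(N):
        for j in range(N):
            s = a @ np.concatenate([Z[i], Z[j]])
            e[i, j] = s if s > 0 else leaky_slope * s
    # Mask non-edges, then softmax over each node's neighborhood
    e = np.where(A > 0, e, -np.inf)
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha = alpha / alpha.sum(axis=1, keepdims=True)
    return alpha @ Z, alpha      # weighted feature aggregation
```

Because `alpha` is produced only by a softmax over learned scores, with no direct supervision, the weights are free to drift in ways that cause the over-fitting and over-smoothing the paper targets; its margin-based constraints act on these `alpha` values during training.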

