Causal-Based Supervision of Attention in Graph Neural Network: A Better and Simpler Choice towards Powerful Attention

05/22/2023
by   Hongjun Wang, et al.

In recent years, attention mechanisms have demonstrated significant potential in graph representation learning. However, while variants of attention-based GNNs set new benchmarks on numerous real-world datasets, recent work has pointed out that, lacking direct supervision, the attention they induce is less robust and generalizable on noisy graphs. In this paper, we present a new framework that uses tools from causality to provide a powerful supervision signal for learning attention functions. Specifically, we estimate the direct causal effect of attention on the final prediction and then maximize that effect, guiding attention to attend to more meaningful neighbors. Our method can serve as a plug-and-play module for any canonical attention-based GNN and is trained end-to-end. Extensive experiments on a wide range of benchmark datasets illustrate that, by directly supervising attention with our method, the model converges faster with a clearer decision boundary and thus yields better performance.
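To make the idea concrete, here is a minimal NumPy sketch (not the authors' implementation) of one common way to estimate an attention effect: compare the model's output under the learned attention with the output under a uniform counterfactual attention over the same neighbors, and use the gap as an auxiliary training signal. All names (`attention_aggregate`, `direct_effect`, the toy graph) are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_aggregate(H, A, scores):
    """Normalize scores over each node's neighbors, then aggregate features."""
    masked = np.where(A > 0, scores, -1e9)  # restrict attention to existing edges
    alpha = softmax(masked, axis=1)         # per-node distribution over neighbors
    return alpha @ H

def direct_effect(H, A, scores, W_out):
    """Prediction gap between learned attention and a uniform (counterfactual)
    attention over the same neighbors -- a simple proxy for the direct causal
    effect of attention on the output (illustrative, not the paper's estimator)."""
    factual = attention_aggregate(H, A, scores) @ W_out
    counterfactual = attention_aggregate(H, A, np.zeros_like(scores)) @ W_out
    return factual - counterfactual

# Toy graph: 3 nodes with self-loops, 2 features, 2 output classes.
A = np.array([[1, 1, 0], [1, 1, 1], [0, 1, 1]], dtype=float)
H = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
W_out = np.eye(2)
scores = np.array([[2.0, 0.0, 0.0], [0.0, 2.0, 0.0], [0.0, 0.0, 2.0]])

effect = direct_effect(H, A, scores, W_out)
# An auxiliary loss could then reward a large effect, e.g.:
# loss_aux = -np.abs(effect).mean()
```

When the learned scores are themselves uniform, the factual and counterfactual branches coincide and the effect vanishes, so maximizing the effect pushes attention away from trivial averaging.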

Related research:
- Fast Graph Attention Networks Using Effective Resistance Based Graph Sparsification (06/15/2020)
- Improving Attention Mechanism in Graph Neural Networks via Cardinality Preservation (07/04/2019)
- Counterfactual Attention Learning for Fine-Grained Visual Categorization and Re-identification (08/19/2021)
- Causality-based CTR Prediction using Graph Neural Networks (01/30/2023)
- Deconfounded Training for Graph Neural Networks (12/30/2021)
- Improving Graph Attention Networks with Large Margin-based Constraints (10/25/2019)
