Message Passing Attention Networks for Document Understanding

08/17/2019
by Giannis Nikolentzos, et al.

Most graph neural networks can be described in terms of message passing, vertex update, and readout functions. In this paper, we represent documents as word co-occurrence networks and propose an application of the message passing framework to NLP, the Message Passing Attention network for Document understanding (MPAD). We also propose several hierarchical variants of MPAD. Experiments conducted on 10 standard text classification datasets show that our architectures are competitive with the state-of-the-art. Ablation studies reveal further insights about the impact of the different components on performance. Code and data are publicly available.
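To make the abstract's three-function decomposition (message passing, vertex update, readout) concrete, here is a minimal NumPy sketch on a word co-occurrence graph. It is an illustration under simplifying assumptions, not the authors' MPAD implementation: the helper names (build_cooccurrence_graph, message_passing_layer, attention_readout), the co-occurrence window size, the single linear-plus-ReLU update, and the dot-product attention readout are all hypothetical choices.

```python
# Minimal sketch of the message passing / vertex update / readout pattern
# on a word co-occurrence network. Simplified, hypothetical version of the
# framework described in the abstract, not the MPAD architecture itself.

import numpy as np

def build_cooccurrence_graph(tokens, window=2):
    """Adjacency matrix of a word co-occurrence network: an edge links two
    words appearing within `window` tokens of each other."""
    vocab = sorted(set(tokens))
    idx = {w: i for i, w in enumerate(vocab)}
    A = np.zeros((len(vocab), len(vocab)))
    for i, w in enumerate(tokens):
        for j in range(i + 1, min(i + window + 1, len(tokens))):
            u, v = idx[w], idx[tokens[j]]
            if u != v:
                A[u, v] = A[v, u] = 1.0
    return vocab, A

def message_passing_layer(H, A, W):
    """Message: sum of neighbor features; vertex update: linear map + ReLU."""
    M = A @ H                      # aggregate messages from neighbors
    return np.maximum(M @ W, 0.0)  # update each vertex representation

def attention_readout(H, q):
    """Readout: attention-weighted sum of vertex representations."""
    scores = H @ q
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()       # softmax over vertices
    return weights @ H             # one vector for the whole document

rng = np.random.default_rng(0)
tokens = "message passing networks read documents as graphs".split()
vocab, A = build_cooccurrence_graph(tokens)
H = rng.normal(size=(len(vocab), 8))   # initial word vectors
W = rng.normal(size=(8, 8))
H = message_passing_layer(H, A, W)
doc_vec = attention_readout(H, rng.normal(size=8))
print(doc_vec.shape)  # (8,) document embedding, fed to a classifier
```

The sketch shows only the three-function skeleton; MPAD's actual readout attention and its hierarchical variants are more elaborate.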

Related research

02/28/2023
Framelet Message Passing
Graph neural networks (GNNs) have achieved champion-level performance in a wide range of applications...

04/21/2023
Convergence of Message Passing Graph Neural Networks with Generic Aggregation On Large Random Graphs
We study the convergence of message passing graph neural networks on ran...

05/31/2021
Neural message passing for joint paratope-epitope prediction
Antibodies are proteins in the immune system which bind to antigens to d...

04/26/2022
Function-words Enhanced Attention Networks for Few-Shot Inverse Relation Classification
Relation classification is the task of identifying semantic relations between tw...

02/21/2023
AttentionMixer: An Accurate and Interpretable Framework for Process Monitoring
An accurate and explainable automatic monitoring system is critical for ...

07/14/2022
Antibody-Antigen Docking and Design via Hierarchical Equivariant Refinement
Computational antibody design seeks to automatically create an antibody ...

05/09/2021
Dispatcher: A Message-Passing Approach To Language Modelling
This paper proposes a message-passing mechanism to address language mode...
