Syntax-Aware Graph-to-Graph Transformer for Semantic Role Labelling

04/15/2021
by Alireza Mohammadshahi, et al.

The goal of semantic role labelling (SRL) is to recognise the predicate-argument structure of a sentence. Recent models have shown that syntactic information can improve SRL performance, yet syntax-agnostic approaches also achieve competitive results, so the best way to encode syntactic information for SRL remains an open question. In this paper, we propose the Syntax-aware Graph-to-Graph Transformer (SynG2G-Tr) architecture, which encodes syntactic structure by inputting graph relations as embeddings directly into the Transformer's self-attention mechanism. This approach adds a soft bias towards attention patterns that follow the syntactic structure, while still allowing the model to use this information to learn alternative patterns. We evaluate our model on both span-based (CoNLL 2005) and dependency-based (CoNLL 2009) SRL benchmarks, and it outperforms all previous syntax-aware and syntax-agnostic models in both in-domain and out-of-domain settings. Our architecture is general and can be applied to encode any graph information for a desired downstream task.
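To make the mechanism concrete, below is a minimal, single-head sketch (not the authors' code) of how graph relations can be injected into self-attention as embeddings, in the spirit described above and of relative-position embeddings. The class name, the "no edge" convention, and the exact content-relation interaction terms are illustrative assumptions; SynG2G-Tr's actual formulation may differ.

    # Illustrative sketch, assuming PyTorch; single head for brevity.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class GraphAwareSelfAttention(nn.Module):
        def __init__(self, d_model: int, n_relations: int):
            super().__init__()
            self.d = d_model
            self.q = nn.Linear(d_model, d_model)
            self.k = nn.Linear(d_model, d_model)
            self.v = nn.Linear(d_model, d_model)
            # One embedding per dependency relation type (index 0 = "no edge"),
            # mixed into the attention scores and values.
            self.rel_k = nn.Embedding(n_relations, d_model)
            self.rel_v = nn.Embedding(n_relations, d_model)

        def forward(self, x, rel_ids):
            # x:       (batch, seq, d_model) token representations
            # rel_ids: (batch, seq, seq) relation label between tokens i and j
            q, k, v = self.q(x), self.k(x), self.v(x)
            rk = self.rel_k(rel_ids)   # (batch, seq, seq, d_model)
            rv = self.rel_v(rel_ids)
            # Content-content score plus a content-relation term:
            # score[i, j] = q_i . k_j + q_i . rel_k[i, j]
            scores = torch.einsum("bid,bjd->bij", q, k)
            scores = scores + torch.einsum("bid,bijd->bij", q, rk)
            attn = F.softmax(scores / self.d ** 0.5, dim=-1)
            # Values are likewise augmented with relation embeddings.
            out = torch.einsum("bij,bjd->bid", attn, v)
            out = out + torch.einsum("bij,bijd->bid", attn, rv)
            return out

Because the relation embeddings only add a term to the attention scores rather than masking them, the syntactic graph acts as a soft bias: the model is nudged towards attending along parse edges but remains free to learn alternative attention patterns.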
