Syntax-aware Neural Semantic Role Labeling

07/22/2019
by Qingrong Xia, et al.

Semantic role labeling (SRL), also known as shallow semantic parsing, is an important yet challenging task in NLP. Motivated by the close correlation between syntactic and semantic structures, traditional discrete-feature-based SRL approaches make heavy use of syntactic features. In contrast, deep-neural-network-based approaches usually encode the input sentence as a word sequence without considering the syntactic structures. In this work, we investigate several previous approaches for encoding syntactic trees, and conduct a thorough study on whether extra syntax-aware representations are beneficial for neural SRL models. Experiments on the benchmark CoNLL-2005 dataset show that syntax-aware SRL approaches can effectively improve performance over a strong baseline with external word representations from ELMo. With the extra syntax-aware representations, our approaches achieve a new state of the art of 85.6 F1 (single model) and 86.6 F1 (ensemble) on the test data, outperforming the corresponding strong baselines with ELMo by 0.8 and 1.0 F1, respectively. Detailed error analysis is conducted to gain more insight into the investigated approaches.
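The core idea of syntax-aware SRL described above is to enrich each token's input representation with an encoding of its syntactic context before tagging. The sketch below illustrates one common variant, simple per-token concatenation of a contextual word embedding (e.g., from ELMo) with a syntax-derived vector; the dimensions and the concatenation scheme are illustrative assumptions, not the paper's exact architecture.

```python
import numpy as np

# Illustrative sketch (not the paper's exact model): build a
# syntax-aware input by concatenating, per token, a contextual
# word representation with a syntax-derived representation
# (e.g., an embedding of the token's dependency relation).
rng = np.random.default_rng(0)

seq_len = 5       # tokens in the sentence
word_dim = 1024   # ELMo-style contextual embedding size (assumed)
syn_dim = 64      # syntax-representation size (assumed)

# Stand-ins for the two encoders' outputs.
word_repr = rng.standard_normal((seq_len, word_dim))
syntax_repr = rng.standard_normal((seq_len, syn_dim))

# Syntax-aware input: simple concatenation along the feature axis;
# this is what a downstream BiLSTM/self-attention tagger would consume.
model_input = np.concatenate([word_repr, syntax_repr], axis=-1)
print(model_input.shape)  # (5, 1088)
```

In practice the syntax vectors would come from a tree encoder (e.g., over dependency or constituency parses) rather than random values, but the integration point, widening each token's input features, is the same.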


