Do Syntax Trees Help Pre-trained Transformers Extract Information?

08/20/2020
by Devendra Singh Sachan, et al.

Much recent work suggests that incorporating syntax information from dependency trees can improve task-specific transformer models. However, the effect of incorporating dependency tree information into pre-trained transformer models (e.g., BERT) remains unclear, especially given recent studies highlighting how these models implicitly encode syntax. In this work, we systematically study the utility of incorporating dependency trees into pre-trained transformers on three representative information extraction tasks: semantic role labeling (SRL), named entity recognition, and relation extraction. We propose and investigate two distinct strategies for incorporating dependency structure: a late fusion approach, which applies a graph neural network on the output of a transformer, and a joint fusion approach, which infuses syntax structure into the transformer attention layers. These strategies are representative of prior work, but we introduce design decisions that prove essential for strong performance. Our empirical analysis demonstrates that these syntax-infused transformers obtain state-of-the-art results on SRL and relation extraction tasks. However, our analysis also reveals a critical shortcoming of these models: their performance gains are highly contingent on the availability of human-annotated dependency parses, which raises important questions regarding the viability of syntax-augmented transformers in real-world applications.
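To make the two strategies concrete, below is a minimal PyTorch-style sketch of both ideas. All names (LateFusionGCN, SyntaxMaskedAttention), shapes, and design choices here are illustrative assumptions, not the authors' released code: the late fusion layer runs one graph convolution over a parser-derived dependency adjacency on top of contextual token embeddings (e.g., BERT's last hidden states), while the joint fusion layer is one simple way to infuse syntax into attention, masking self-attention logits so tokens attend only to their syntactic neighbors.

```python
import torch
import torch.nn as nn


class LateFusionGCN(nn.Module):
    # Late fusion sketch: a graph-convolution layer over the dependency
    # tree, applied to token embeddings produced by a transformer.
    def __init__(self, hidden_size: int):
        super().__init__()
        self.linear = nn.Linear(hidden_size, hidden_size)

    def forward(self, embeddings: torch.Tensor, adjacency: torch.Tensor) -> torch.Tensor:
        # embeddings: (batch, seq_len, hidden)
        # adjacency:  (batch, seq_len, seq_len) 0/1 dependency edges with
        # self-loops; each token averages over its syntactic neighbors.
        degree = adjacency.sum(dim=-1, keepdim=True).clamp(min=1.0)
        neighbor_mean = torch.matmul(adjacency / degree, embeddings)
        return torch.relu(self.linear(neighbor_mean))


class SyntaxMaskedAttention(nn.Module):
    # Joint fusion sketch (one simple variant): restrict self-attention
    # to dependency-tree neighbors by masking the attention logits.
    def __init__(self, hidden_size: int):
        super().__init__()
        self.q = nn.Linear(hidden_size, hidden_size)
        self.k = nn.Linear(hidden_size, hidden_size)
        self.v = nn.Linear(hidden_size, hidden_size)

    def forward(self, x: torch.Tensor, adjacency: torch.Tensor) -> torch.Tensor:
        scores = torch.matmul(self.q(x), self.k(x).transpose(-2, -1)) / x.size(-1) ** 0.5
        # Self-loops in the adjacency keep at least one unmasked position
        # per row, so the softmax stays well defined.
        scores = scores.masked_fill(adjacency == 0, float("-inf"))
        return torch.matmul(torch.softmax(scores, dim=-1), self.v(x))


# Dummy usage: random tensors stand in for BERT's last hidden states and
# for a parser-derived adjacency (here, self-loops only).
batch, seq_len, hidden = 2, 6, 768
x = torch.randn(batch, seq_len, hidden)
adj = torch.eye(seq_len).expand(batch, -1, -1)
print(LateFusionGCN(hidden)(x, adj).shape)          # torch.Size([2, 6, 768])
print(SyntaxMaskedAttention(hidden)(x, adj).shape)  # torch.Size([2, 6, 768])
```

Note that hard attention masking is only one way to realize joint fusion; the paper infuses syntax into the attention layers, and the sketch above should be read as an illustration of that idea rather than the authors' architecture.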



Related research

03/07/2021 · Syntax-BERT: Improving Pre-trained Transformers with Syntax Trees
Pre-trained language models like BERT achieve superior performances in v...

09/17/2019 · Span-based Joint Entity and Relation Extraction with Transformer Pre-training
We introduce SpERT, an attention model for span-based joint entity and r...

01/09/2021 · Learning Better Sentence Representation with Syntax Information
Sentence semantic understanding is a key topic in the field of natural l...

04/10/2019 · Simple BERT Models for Relation Extraction and Semantic Role Labeling
We present simple BERT-based models for relation extraction and semantic...

03/04/2016 · Getting More Out Of Syntax with PropS
Semantic NLP applications often rely on dependency trees to recognize ma...

11/15/2022 · CSynGEC: Incorporating Constituent-based Syntax for Grammatical Error Correction with a Tailored GEC-Oriented Parser
Recently, Zhang et al. (2022) propose a syntax-aware grammatical error c...

11/11/2019 · Leveraging Dependency Forest for Neural Medical Relation Extraction
Medical relation extraction discovers relations between entity mentions ...
