Improving Relation Extraction by Pre-trained Language Representations

06/07/2019
by Christoph Alt, et al.

Current state-of-the-art relation extraction methods typically rely on a set of lexical, syntactic, and semantic features, explicitly computed in a pre-processing step. Training feature extraction models requires additional annotated language resources, which severely restricts the applicability and portability of relation extraction to novel languages. Similarly, pre-processing introduces an additional source of error. To address these limitations, we introduce TRE, a Transformer for Relation Extraction, extending the OpenAI Generative Pre-trained Transformer [Radford et al., 2018]. Unlike previous relation extraction models, TRE uses pre-trained deep language representations instead of explicit linguistic features to inform the relation classification, and combines them with the self-attentive Transformer architecture to effectively model long-range dependencies between entity mentions. TRE allows us to learn implicit linguistic features solely from plain text corpora by unsupervised pre-training, before fine-tuning the learned language representations on the relation extraction task. TRE obtains a new state-of-the-art result on the TACRED and SemEval 2010 Task 8 datasets, achieving a test F1 of 67.4 and 87.1, respectively. Furthermore, we observe a significant increase in sample efficiency. With only 20% of the training examples, TRE matches the performance of our baselines and our model trained from scratch on 100% of the TACRED dataset. We open-source our trained models, experiments, and source code.
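
The recipe is straightforward to reproduce with current tooling. The sketch below is not the authors' released code: it fine-tunes a pre-trained GPT-style language model as a relation classifier using the Hugging Face transformers library, with GPT-2 standing in for the original OpenAI GPT. The checkpoint name, entity-marker format, and label count are illustrative assumptions.

# Rough sketch of the TRE recipe: fine-tune a pre-trained Transformer
# language model as a relation classifier. Not the authors' code; the
# checkpoint ("gpt2"), the entity-marker scheme, and the label count
# are illustrative assumptions.
from transformers import GPT2Tokenizer, GPT2ForSequenceClassification

NUM_RELATIONS = 42  # e.g. TACRED: 41 relation types plus "no_relation"

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 defines no pad token

model = GPT2ForSequenceClassification.from_pretrained(
    "gpt2", num_labels=NUM_RELATIONS
)
model.config.pad_token_id = tokenizer.pad_token_id

# Mark the two entity mentions in the input text; the exact marker
# format here is an assumption, not the paper's scheme.
sentence = "<e1> Bill Gates </e1> founded <e2> Microsoft </e2> in 1975."
inputs = tokenizer(sentence, return_tensors="pt")

# The forward pass yields one logit per relation type; fine-tuning
# minimizes cross-entropy between these logits and the gold relation.
logits = model(**inputs).logits             # shape: (1, NUM_RELATIONS)
predicted_relation = logits.argmax(dim=-1).item()

In practice, the classification head and the underlying language model are fine-tuned jointly on a labeled relation extraction dataset such as TACRED, which is what lets the pre-trained representations replace hand-engineered linguistic features.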


Related research

06/19/2019
Fine-tuning Pre-Trained Transformer Language Models to Distantly Supervised Relation Extraction
Distantly supervised relation extraction is widely used to extract relat...

06/19/2019
Reflex: Flexible Framework for Relation Extraction in Multiple Domains
Systematic comparison of methods for relation extraction (RE) is difficu...

04/08/2020
Downstream Model Design of Pre-trained Language Model for Relation Extraction Task
Supervised relation extraction methods based on deep neural network play...

05/11/2022
Pre-trained Language Models as Re-Annotators
Annotation noise is widespread in datasets, but manually revising a flaw...

12/29/2022
Reviewing Labels: Label Graph Network with Top-k Prediction Set for Relation Extraction
The typical way for relation extraction is fine-tuning large pre-trained...

02/04/2019
Extracting Multiple-Relations in One-Pass with Pre-Trained Transformers
Most approaches to extracting multiple relations from a paragraph requir...

04/17/2020
Probing Linguistic Features of Sentence-Level Representations in Neural Relation Extraction
Despite the recent progress, little is known about the features captured...
