Fine-tune Bert for DocRED with Two-step Process

09/26/2019
by Hong Wang, et al.

Modelling relations between multiple entities has attracted increasing attention recently, and a new dataset called DocRED has been collected to accelerate research on document-level relation extraction. Current baselines for this task use a BiLSTM to encode the whole document and are trained from scratch. We argue that such simple baselines are not strong enough to model the complex interactions between entities. In this paper, we apply a pre-trained language model (BERT) to provide a stronger baseline for this task. We also find that solving the task in two phases further improves performance: the first step predicts whether two entities have a relation at all, and the second step predicts the specific relation.
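
The abstract does not give implementation details, but the two-step idea can be sketched roughly as follows: a BERT encoder produces token representations for the whole document, a binary head decides whether an entity pair is related at all, and a second head names the relation. The entity-pair pooling, head sizes, and model checkpoint in the sketch below are illustrative assumptions, not the authors' exact architecture.

# A minimal sketch of the two-step pipeline described in the abstract,
# using Hugging Face's BERT as the document encoder. Mean-pooling over
# entity mention tokens is an assumption made for illustration.
import torch
import torch.nn as nn
from transformers import BertModel

class TwoStepDocRE(nn.Module):
    def __init__(self, num_relations, hidden_size=768):
        super().__init__()
        self.encoder = BertModel.from_pretrained("bert-base-uncased")
        pair_dim = 2 * hidden_size
        # Step 1: binary head -- does this entity pair have any relation?
        self.has_relation_head = nn.Linear(pair_dim, 2)
        # Step 2: multi-class head -- which relation does the pair express?
        self.relation_head = nn.Linear(pair_dim, num_relations)

    def entity_repr(self, token_states, mention_mask):
        # Mean-pool BERT token states over an entity's mention tokens.
        # mention_mask: (batch, seq_len) with 1s on the entity's tokens.
        mask = mention_mask.unsqueeze(-1).float()
        return (token_states * mask).sum(1) / mask.sum(1).clamp(min=1.0)

    def forward(self, input_ids, attention_mask, head_entity_mask, tail_entity_mask):
        token_states = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        head = self.entity_repr(token_states, head_entity_mask)
        tail = self.entity_repr(token_states, tail_entity_mask)
        pair = torch.cat([head, tail], dim=-1)
        return self.has_relation_head(pair), self.relation_head(pair)

    # At inference, the relation-type head is only consulted for pairs
    # that the binary head labels as related, mirroring the two phases
    # described in the abstract.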

Related research

08/27/2020  Entity and Evidence Guided Relation Extraction for DocRED
05/20/2019  Enriching Pre-trained Language Model with Entity Information for Relation Classification
11/01/2019  Deep Bidirectional Transformers for Relation Extraction without Supervision
04/08/2020  Downstream Model Design of Pre-trained Language Model for Relation Extraction Task
12/04/2020  DDRel: A New Dataset for Interpersonal Relation Classification in Dyadic Dialogues
09/26/2019  Biomedical relation extraction with pre-trained language representations and minimal task-specific architecture
08/24/2019  BERT for Coreference Resolution: Baselines and Analysis
