Fine-tuning Pre-Trained Transformer Language Models to Distantly Supervised Relation Extraction

06/19/2019
by Christoph Alt, et al.

Distantly supervised relation extraction is widely used to extract relational facts from text, but it suffers from noisy labels. Current relation extraction methods try to alleviate this noise through multi-instance learning and by providing supporting linguistic and contextual information to guide relation classification more effectively. While these models achieve state-of-the-art results, we observe that they are biased towards recognizing a limited set of relations with high precision, while ignoring those in the long tail. To address this gap, we utilize a pre-trained language model, the OpenAI Generative Pre-trained Transformer (GPT) [Radford et al., 2018]. The GPT and similar models have been shown to capture semantic and syntactic features, as well as a notable amount of "common-sense" knowledge, which we hypothesize are important features for recognizing a more diverse set of relations. By extending the GPT to the distantly supervised setting and fine-tuning it on the NYT10 dataset, we show that it predicts a larger set of distinct relation types with high confidence. Manual and automated evaluation shows that our model achieves a state-of-the-art AUC score of 0.422 on NYT10 and performs especially well at higher recall levels.
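
To make the setup concrete, here is a minimal sketch of bag-level ("multi-instance") fine-tuning of a GPT-style model for distantly supervised relation extraction, written in PyTorch with the Hugging Face transformers library. This is not the authors' released code: entity markers, the NYT10 data loader, and the paper's exact bag-aggregation scheme are omitted, and the relation count, label index, learning rate, and max-over-bag pooling used here are illustrative assumptions.

```python
import torch
import torch.nn as nn
from transformers import OpenAIGPTModel, OpenAIGPTTokenizer

NUM_RELATIONS = 53  # NYT10 relation inventory incl. NA (assumption)

# The original GPT tokenizer ships without a padding token, so add one.
tokenizer = OpenAIGPTTokenizer.from_pretrained("openai-gpt")
tokenizer.add_special_tokens({"pad_token": "[PAD]"})

class BagRelationClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.gpt = OpenAIGPTModel.from_pretrained("openai-gpt")
        self.gpt.resize_token_embeddings(len(tokenizer))
        self.classifier = nn.Linear(self.gpt.config.n_embd, NUM_RELATIONS)

    def forward(self, input_ids, attention_mask):
        # Encode every sentence of the bag with the pre-trained transformer.
        hidden = self.gpt(input_ids=input_ids,
                          attention_mask=attention_mask).last_hidden_state
        # Mean-pool over non-padding tokens -> one vector per sentence.
        mask = attention_mask.unsqueeze(-1).float()
        sent = (hidden * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
        logits = self.classifier(sent)      # per-sentence relation scores
        # Multi-instance aggregation (assumption: max over the bag), so a
        # single confidently classified sentence can label the whole bag.
        return logits.max(dim=0).values     # shape: (NUM_RELATIONS,)

model = BagRelationClassifier()
optimizer = torch.optim.Adam(model.parameters(), lr=6.25e-5)  # GPT fine-tuning LR
loss_fn = nn.CrossEntropyLoss()

# Toy "bag": all sentences mention the same (head, tail) entity pair and
# inherit one distantly supervised label; real training iterates NYT10 bags.
bag = ["Barack Obama was born in Honolulu.",
       "Obama returned to Honolulu to visit his family."]
enc = tokenizer(bag, padding=True, return_tensors="pt")
label = torch.tensor(7)  # e.g. /people/person/place_of_birth (hypothetical index)

bag_logits = model(enc["input_ids"], enc["attention_mask"])
loss = loss_fn(bag_logits.unsqueeze(0), label.unsqueeze(0))
loss.backward()
optimizer.step()
```

Max-over-bag pooling is only one possible aggregation choice; the essential idea is that every sentence in a bag shares the same distantly supervised (entity pair, relation) label, so the loss is computed once per bag rather than once per possibly mislabeled sentence.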

Related research

02/01/2021 · Improving Distantly-Supervised Relation Extraction through BERT-based Label Instance Embeddings
Distantly-supervised relation extraction (RE) is an effective method to ...

06/07/2019 · Improving Relation Extraction by Pre-trained Language Representations
Current state-of-the-art relation extraction methods typically rely on a...

08/26/2022 · GRASP: Guiding model with RelAtional Semantics using Prompt for Dialogue Relation Extraction
The dialogue-based relation extraction (DialogRE) task aims to predict t...

05/11/2022 · Pre-trained Language Models as Re-Annotators
Annotation noise is widespread in datasets, but manually revising a flaw...

11/01/2019 · Deep Bidirectional Transformers for Relation Extraction without Supervision
We present a novel framework to deal with relation extraction tasks in c...

04/15/2021 · A Sample-Based Training Method for Distantly Supervised Relation Extraction with Pre-Trained Transformers
Multiple instance learning (MIL) has become the standard learning paradi...

05/08/2023 · Revisiting Relation Extraction in the era of Large Language Models
Relation extraction (RE) is the core NLP task of inferring semantic rela...
