Deep Bidirectional Transformers for Relation Extraction without Supervision

11/01/2019
by Yannis Papanikolaou, et al.

We present a novel framework for relation extraction in the complete absence of supervision, whether in the form of gold annotations or of relations from a knowledge base. Our approach leverages syntactic parsing and pre-trained word embeddings to extract a few but precise relations, which are then used to annotate a larger corpus in a manner identical to distant supervision. The resulting data set is employed to fine-tune a pre-trained BERT model to perform relation extraction. Empirical evaluation on four data sets from the biomedical domain shows that our method significantly outperforms two simple baselines for unsupervised relation extraction and, despite using no supervision at all, achieves results only slightly worse than the state of the art on three of the four data sets. Importantly, we show that it is possible to successfully fine-tune a large pre-trained language model with noisy data, as opposed to previous works that rely on gold data for fine-tuning.
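The pipeline the abstract describes can be sketched in two stages: first extract a small set of high-precision seed relations, then use those seeds to distantly annotate a larger corpus, producing noisy training data for BERT fine-tuning. The sketch below is a minimal illustration of that idea only; the trigger patterns, bracketed entity markers, and relation names are assumptions for demonstration (the paper itself uses syntactic parsing and pre-trained word embeddings for the seed stage).

```python
# Hypothetical sketch of the two-stage pipeline: high-precision seed
# extraction, then distant supervision over a larger corpus. Patterns,
# entity markers and relation names are illustrative assumptions.
import re

# Stage 1: extract few-but-precise seed relations with simple lexical
# triggers (stand-in for the paper's parsing + embedding approach).
SEED_PATTERNS = {
    "INHIBITS": re.compile(r"\[(?P<h>[^\]]+)\] inhibits \[(?P<t>[^\]]+)\]"),
    "TREATS":   re.compile(r"\[(?P<h>[^\]]+)\] treats \[(?P<t>[^\]]+)\]"),
}

def extract_seeds(sentences):
    """Return {(head, tail): relation} for sentences matching a seed pattern."""
    seeds = {}
    for s in sentences:
        for rel, pat in SEED_PATTERNS.items():
            m = pat.search(s)
            if m:
                seeds[(m.group("h"), m.group("t"))] = rel
    return seeds

def distant_annotate(sentences, seeds):
    """Label every sentence mentioning a known seed pair (distant supervision)."""
    labelled = []
    for s in sentences:
        ents = set(re.findall(r"\[([^\]]+)\]", s))
        for (h, t), rel in seeds.items():
            if h in ents and t in ents:
                labelled.append((s, h, t, rel))
    return labelled

corpus = [
    "[aspirin] inhibits [COX-1] in platelets.",
    "Trials confirm [aspirin] reduces activity of [COX-1].",  # no trigger word
    "[metformin] treats [type 2 diabetes] effectively.",
]
seeds = extract_seeds(corpus)
train_set = distant_annotate(corpus, seeds)
# The second sentence carries no trigger, yet is still labelled via the
# seed pair -- exactly the (noisy) training signal later used to
# fine-tune a BERT relation classifier.
```

Note how distant supervision deliberately over-labels: any sentence containing a seed pair inherits the relation, which is what makes the resulting data noisy and the BERT fine-tuning step non-trivial.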


