An Empirical Study of Using Pre-trained BERT Models for Vietnamese Relation Extraction Task at VLSP 2020

12/18/2020
by Pham Quang Nhat Minh, et al.

In this paper, we present an empirical study of using pre-trained BERT models for the relation extraction task at the VLSP 2020 Evaluation Campaign. We applied two state-of-the-art BERT-based models: R-BERT and BERT with entity starts. For each model, we compared two pre-trained BERT models: FPTAI/vibert and NlpHUST/vibert4news. We found that NlpHUST/vibert4news significantly outperforms FPTAI/vibert on the Vietnamese relation extraction task. Finally, we proposed a simple ensemble model that combines R-BERT and BERT with entity starts. Our ensemble model slightly improved over both single models on the development data provided by the task organizers.
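For readers unfamiliar with the "entity starts" scheme, the sketch below illustrates one common way to build such a classifier with the Hugging Face transformers library: the encoder's hidden states at the two entity-start marker tokens are concatenated and passed through a linear layer to predict the relation. This is a minimal sketch under assumed preprocessing (marker tokens such as [E1]/[E2] inserted around each entity mention, with their token positions recorded), not the authors' implementation; the class name and hyperparameters here are illustrative.

```python
import torch
import torch.nn as nn
from transformers import AutoModel

class EntityStartRelationClassifier(nn.Module):
    """Relation classifier over the hidden states at the two entity-start
    marker tokens ("entity markers - entity start" scheme).
    Minimal sketch; not the authors' implementation."""

    def __init__(self, pretrained_name: str, num_relations: int):
        super().__init__()
        # E.g. a Vietnamese BERT such as the vibert4news model mentioned
        # in the abstract (the exact Hugging Face model ID is an assumption).
        self.encoder = AutoModel.from_pretrained(pretrained_name)
        hidden = self.encoder.config.hidden_size
        self.dropout = nn.Dropout(0.1)
        # Concatenated entity-start vectors -> relation logits.
        self.classifier = nn.Linear(2 * hidden, num_relations)

    def forward(self, input_ids, attention_mask, e1_start, e2_start):
        # e1_start / e2_start: (batch,) LongTensors with the positions of
        # the [E1] and [E2] marker tokens produced during preprocessing.
        # Last-layer hidden states: (batch, seq_len, hidden)
        h = self.encoder(input_ids=input_ids,
                         attention_mask=attention_mask).last_hidden_state
        rows = torch.arange(h.size(0), device=h.device)
        h_e1 = h[rows, e1_start]  # hidden state at the [E1] marker
        h_e2 = h[rows, e2_start]  # hidden state at the [E2] marker
        return self.classifier(self.dropout(torch.cat([h_e1, h_e2], dim=-1)))
```

The abstract does not specify how the ensemble combines the two models; one common simple choice consistent with the description would be averaging the per-class probabilities predicted by R-BERT and the entity-start model.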

