An Empirical Study of Using Pre-trained BERT Models for Vietnamese Relation Extraction Task at VLSP 2020

12/18/2020
by Pham Quang Nhat Minh, et al.

In this paper, we present an empirical study of using pre-trained BERT models for the relation extraction task at the VLSP 2020 Evaluation Campaign. We applied two state-of-the-art BERT-based models: R-BERT and BERT with entity start markers. For each model, we compared two pre-trained BERT models: FPTAI/vibert and NlpHUST/vibert4news. We found that NlpHUST/vibert4news significantly outperforms FPTAI/vibert on the Vietnamese relation extraction task. Finally, we propose a simple ensemble model that combines R-BERT and BERT with entity start markers. Our ensemble model yields a slight improvement over the two single models on the development data provided by the task organizers.
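To make the two ingredients of the abstract concrete, the sketch below shows (a) the "entity start" representation, which classifies a relation from the encoder's hidden states at the two entity-start marker positions, and (b) a probability-averaging ensemble. This is a minimal sketch, not the authors' released code: the Hugging Face model id, the [E1]/[E2] marker tokens, the relation label count, and the averaging rule are all illustrative assumptions (the abstract calls the ensemble "simple" without specifying the combination rule).

```python
# Minimal sketch, NOT the authors' released code: illustrates the
# "entity start" representation and one simple ensemble rule. The model
# id, marker tokens, label count, and averaging rule are assumptions.
import torch
from torch import nn
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "NlpHUST/vibert4news-base-cased"  # assumed hub id for vibert4news


class EntityStartRelationClassifier(nn.Module):
    """Classify a relation from the hidden states at the two
    entity-start marker positions, concatenated."""

    def __init__(self, model_id: str, num_relations: int):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_id)
        hidden = self.encoder.config.hidden_size
        self.classifier = nn.Linear(2 * hidden, num_relations)

    def forward(self, input_ids, attention_mask, e1_pos, e2_pos):
        h = self.encoder(input_ids=input_ids,
                         attention_mask=attention_mask).last_hidden_state
        idx = torch.arange(h.size(0))
        h_e1 = h[idx, e1_pos]  # hidden state at the [E1] marker
        h_e2 = h[idx, e2_pos]  # hidden state at the [E2] marker
        return self.classifier(torch.cat([h_e1, h_e2], dim=-1))


def ensemble_probs(logits_a, logits_b):
    """Average the two models' class distributions. The paper only says
    the ensemble is 'simple'; averaging is an assumption here."""
    return (logits_a.softmax(-1) + logits_b.softmax(-1)) / 2


tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
tokenizer.add_special_tokens(
    {"additional_special_tokens": ["[E1]", "[/E1]", "[E2]", "[/E2]"]})

model = EntityStartRelationClassifier(MODEL_ID, num_relations=5)  # 5 is made up
model.encoder.resize_token_embeddings(len(tokenizer))  # account for new markers

text = "[E1] Hà Nội [/E1] là thủ đô của [E2] Việt Nam [/E2] ."
enc = tokenizer(text, return_tensors="pt")
e1 = (enc.input_ids[0] == tokenizer.convert_tokens_to_ids("[E1]")).nonzero()[0]
e2 = (enc.input_ids[0] == tokenizer.convert_tokens_to_ids("[E2]")).nonzero()[0]
logits = model(enc.input_ids, enc.attention_mask, e1, e2)
probs = ensemble_probs(logits, logits)  # stand-in for a second model's logits
```

R-BERT differs mainly in how it builds the relation representation: it averages the hidden states over each full entity span and also uses the [CLS] vector before classifying. Since both models end in per-relation logits, the same averaging line applies unchanged to an R-BERT head.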

