PEL-BERT: A Joint Model for Protocol Entity Linking

01/28/2020
by   Shoubin Li, et al.

Pre-trained models such as BERT are widely used in NLP and consistently improve performance across tasks when fine-tuned. Nevertheless, a BERT model fine-tuned on our protocol corpus still performs poorly on the Entity Linking (EL) task. In this paper, we propose a model that joins a fine-tuned language model with an RFC Domain Model. First, we design a Protocol Knowledge Base as the guideline for protocol EL. Second, we propose a novel model, PEL-BERT, to link named entities in protocols to categories in the Protocol Knowledge Base. Finally, we conduct a comprehensive study of the performance of pre-trained language models on descriptive texts and abstract concepts. Experimental results demonstrate that our model achieves state-of-the-art EL performance on our annotated dataset, outperforming all baselines.
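The core EL step the abstract describes, mapping a named entity mention from protocol text to a category in a Protocol Knowledge Base, can be illustrated with a minimal, hypothetical sketch. Here toy bag-of-words vectors stand in for BERT embeddings, and a mention is linked to the category whose description it is most similar to; the category names and descriptions are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of entity linking against a protocol knowledge base.
# Bag-of-words counts stand in for the fine-tuned BERT embeddings used in
# the actual PEL-BERT model; the KB below is invented for illustration.
import math
from collections import Counter

def embed(text):
    """Toy embedding: lowercase bag-of-words counts (stand-in for BERT)."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def link(mention, kb):
    """Return the KB category whose description best matches the mention."""
    return max(kb, key=lambda cat: cosine(embed(mention), embed(kb[cat])))

# Hypothetical Protocol Knowledge Base: category -> short description.
kb = {
    "TransportProtocol": "transport layer protocol tcp udp segment port",
    "RoutingProtocol": "routing protocol ospf bgp route path advertisement",
    "MessageField": "header field message octet flag length checksum",
}

print(link("tcp port number", kb))          # -> TransportProtocol
print(link("bgp route advertisement", kb))  # -> RoutingProtocol
```

PEL-BERT's contribution, per the abstract, is replacing such a shallow similarity with a joint model that combines a fine-tuned language model with an RFC Domain Model; this sketch only shows the linking interface, not that architecture.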


