Transformer-based Methods for Recognizing Ultra Fine-grained Entities (RUFES)

04/13/2021
by Emanuela Boros, et al.

This paper summarizes the participation of the Laboratoire Informatique, Image et Interaction (L3i laboratory) of the University of La Rochelle in the Recognizing Ultra Fine-grained Entities (RUFES) track within the Text Analysis Conference (TAC) series of evaluation workshops. Our participation relies on two neural models: one based on a pre-trained and fine-tuned language model topped with a stack of Transformer layers for fine-grained entity extraction, and one out-of-the-box model for within-document entity coreference. We observe that our approach shows strong potential for improving the performance of fine-grained entity recognition. Future work will therefore focus on enhancing the models through additional experiments and a deeper analysis of the results.
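As a rough illustration of the first component, the sketch below shows a pre-trained language model with an additional stack of Transformer encoder layers and a token-classification head for fine-grained entity extraction. The model name, layer counts, and label-set size are illustrative assumptions, not the configuration reported in the paper.

import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class FineGrainedEntityTagger(nn.Module):
    """Sketch: pre-trained encoder + extra Transformer stack + token classifier.
    Hyperparameters are hypothetical placeholders."""

    def __init__(self, model_name="bert-base-cased", num_labels=200,
                 num_extra_layers=2, num_heads=8):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        extra_layer = nn.TransformerEncoderLayer(
            d_model=hidden, nhead=num_heads, batch_first=True)
        # Additional stack of Transformer layers on top of the pre-trained encoder.
        self.extra_stack = nn.TransformerEncoder(extra_layer,
                                                 num_layers=num_extra_layers)
        self.classifier = nn.Linear(hidden, num_labels)

    def forward(self, input_ids, attention_mask):
        hidden_states = self.encoder(input_ids=input_ids,
                                     attention_mask=attention_mask).last_hidden_state
        # Ignore padding positions in the extra Transformer stack.
        refined = self.extra_stack(hidden_states,
                                   src_key_padding_mask=attention_mask.eq(0))
        # Per-token logits over the fine-grained entity type inventory.
        return self.classifier(refined)

if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
    batch = tokenizer(["La Rochelle is a city in France."], return_tensors="pt")
    model = FineGrainedEntityTagger()
    logits = model(batch["input_ids"], batch["attention_mask"])
    print(logits.shape)  # (batch_size, sequence_length, num_labels)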


