VTCC-NLP at NL4Opt competition subtask 1: An Ensemble Pre-trained language models for Named Entity Recognition

12/14/2022
by Xuan-Dung Doan, et al.

We propose an ensemble of three pre-trained language models (XLM-R, BART, and DeBERTa-V3) to strengthen the contextualized embeddings used for named entity recognition. Our model achieves a score of 92.9 on the leaderboard of NL4Opt competition subtask 1.
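The abstract does not spell out how the three encoders are combined, so the snippet below is a minimal sketch of one common realization: concatenate word-level (first sub-word) contextual embeddings from the three encoders and feed them to a linear token-classification head. The checkpoints, pooling rule, label count, and head are illustrative assumptions, not the authors' exact architecture.

```python
# Sketch: ensemble contextualized embeddings from XLM-R, BART, and
# DeBERTa-V3 for NER. Checkpoints and the linear head are assumptions.
import torch
from transformers import AutoModel, AutoTokenizer

CHECKPOINTS = ["xlm-roberta-base", "facebook/bart-base", "microsoft/deberta-v3-base"]
tokenizers = [AutoTokenizer.from_pretrained(c) for c in CHECKPOINTS]
models = [AutoModel.from_pretrained(c).eval() for c in CHECKPOINTS]

def word_embeddings(words):
    """One concatenated contextual vector per word, across all three encoders."""
    pooled = []
    for tok, model in zip(tokenizers, models):
        enc = tok(words, is_split_into_words=True, return_tensors="pt")
        with torch.no_grad():
            if model.config.is_encoder_decoder:
                # BART is encoder-decoder; token classification uses the encoder side
                hidden = model.get_encoder()(
                    input_ids=enc["input_ids"],
                    attention_mask=enc["attention_mask"],
                ).last_hidden_state[0]
            else:
                hidden = model(**enc).last_hidden_state[0]
        # first-subword pooling: keep the first sub-token of each word
        firsts = {}
        for pos, wid in enumerate(enc.word_ids()):
            if wid is not None and wid not in firsts:
                firsts[wid] = hidden[pos]
        pooled.append(torch.stack([firsts[i] for i in range(len(words))]))
    return torch.cat(pooled, dim=-1)  # (num_words, 3 * 768) for the base models

words = ["Maximize", "the", "profit", "from", "products", "A", "and", "B"]
feats = word_embeddings(words)
num_labels = 13  # BIO tags over six NL4Opt entity types (assumed)
head = torch.nn.Linear(feats.size(-1), num_labels)  # trained jointly in practice
print(head(feats).shape)  # -> torch.Size([8, 13]), per-word NER logits
```

In practice the linear head (or a CRF layer) would be trained, and the encoders fine-tuned, on the NL4Opt training data; the sketch only shows how the three embedding spaces can be joined into a single per-word feature vector.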

Related research

05/29/2022 · SFE-AI at SemEval-2022 Task 11: Low-Resource Named Entity Recognition using Large Pre-trained Language Models
Large scale pre-training models have been widely used in named entity re...

06/18/2019 · Towards Robust Named Entity Recognition for Historic German
Recent advances in language modeling using deep neural networks have sho...

08/15/2023 · Informed Named Entity Recognition Decoding for Generative Language Models
Ever-larger language models with ever-increasing capabilities are by now...

06/15/2022 · TOKEN is a MASK: Few-shot Named Entity Recognition with Pre-trained Language Models
Transferring knowledge from one domain to another is of practical import...

11/08/2019 · SEPT: Improving Scientific Named Entity Recognition with Span Representation
We introduce a new scientific named entity recognizer called SEPT, which...

11/07/2022 · Reconciliation of Pre-trained Models and Prototypical Neural Networks in Few-shot Named Entity Recognition
Incorporating large-scale pre-trained models with the prototypical neura...

12/20/2019 · End-to-end Named Entity Recognition and Relation Extraction using Pre-trained Language Models
Named entity recognition (NER) and relation extraction (RE) are two impo...