Multilinguals at SemEval-2022 Task 11: Transformer Based Architecture for Complex NER

04/05/2022
by Amit Pandey et al.

We investigate the task of complex named entity recognition (NER) for English. The task is non-trivial due to the semantic ambiguity of the textual structure and the rarity of such entities in the prevalent literature. Using pre-trained language models such as BERT, we obtain competitive performance on this task. We qualitatively analyze the performance of multiple architectures, all of which outperform the baseline by a significant margin. Our best-performing model beats the baseline F1-score by over 9
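Transformer-based NER systems like those described above are typically framed as token classification: the model emits a BIO tag per token, which is then decoded into entity spans. As a minimal, model-free sketch of that final decoding step (the function name, inputs, and the `CW` creative-work tag used in the example are illustrative, not taken from the paper):

```python
def bio_to_spans(tags):
    """Decode a BIO tag sequence into (entity_type, start, end) spans.

    `end` is exclusive. A span opens on a B- tag, extends over matching
    I- tags, and closes on O, a new B- tag, or a mismatched I- tag.
    """
    spans = []
    start, etype = None, None
    for i, tag in enumerate(tags):
        if tag.startswith("B-"):
            if start is not None:           # close any open span
                spans.append((etype, start, i))
            start, etype = i, tag[2:]       # open a new span
        elif tag.startswith("I-") and start is not None and tag[2:] == etype:
            continue                        # extend the open span
        else:
            if start is not None:           # O or mismatched I- closes the span
                spans.append((etype, start, i))
            start, etype = None, None
    if start is not None:                   # close a span that runs to the end
        spans.append((etype, start, len(tags)))
    return spans
```

For example, the tags for "saw The Mona Lisa today" might be `["O", "B-CW", "I-CW", "I-CW", "O"]`, which decodes to a single creative-work span `[("CW", 1, 4)]`.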


Related research

07/14/2022 - Multilinguals at SemEval-2022 Task 11: Complex NER in Semantically Ambiguous Settings for Low Resource Languages
We leverage pre-trained language models to solve the task of complex NER...

05/03/2022 - Predicting Issue Types with seBERT
Pre-trained transformer models are the current state-of-the-art for natu...

01/06/2023 - OPD@NL4Opt: An ensemble approach for the NER task of the optimization problem
In this paper, we present an ensemble approach for the NL4Opt competitio...

05/11/2021 - BERT is to NLP what AlexNet is to CV: Can Pre-Trained Language Models Identify Analogies?
Analogies play a central role in human commonsense reasoning. The abilit...

06/14/2021 - Can BERT Dig It? – Named Entity Recognition for Information Retrieval in the Archaeology Domain
The amount of archaeological literature is growing rapidly. Until recent...

10/18/2017 - OhioState at IJCNLP-2017 Task 4: Exploring Neural Architectures for Multilingual Customer Feedback Analysis
This paper describes our systems for IJCNLP 2017 Shared Task on Customer...

10/06/2022 - HealthE: Classifying Entities in Online Textual Health Advice
The processing of entities in natural language is essential to many medi...
