Multilingual Named Entity Recognition Using Pretrained Embeddings, Attention Mechanism and NCRF

06/21/2019
by   Anton A. Emelyanov, et al.

In this paper we tackle the multilingual named entity recognition task. We use the BERT language model as embeddings with a bidirectional recurrent network, attention, and NCRF on top. We apply multilingual BERT only as an embedder, without any fine-tuning. We test our model on the dataset of the BSNLP shared task, which consists of texts in Bulgarian, Czech, Polish, and Russian.
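As a rough illustration of the architecture the abstract describes (frozen multilingual BERT embeddings feeding a bidirectional recurrent layer, attention, and a CRF decoder), below is a minimal PyTorch sketch. It is a sketch under stated assumptions, not the authors' implementation: the paper uses NCRF, while here the torchcrf package (pytorch-crf) stands in for the CRF layer; the choice of an LSTM as the recurrent network, multi-head self-attention as the attention mechanism, the class name BertBiLSTMAttnCRF, and all hyperparameters are illustrative assumptions.

import torch
import torch.nn as nn
from transformers import AutoModel
from torchcrf import CRF  # pip install pytorch-crf; stand-in for the paper's NCRF

class BertBiLSTMAttnCRF(nn.Module):
    def __init__(self, num_tags, bert_name="bert-base-multilingual-cased",
                 hidden=256, heads=4):
        super().__init__()
        self.bert = AutoModel.from_pretrained(bert_name)
        for p in self.bert.parameters():   # BERT serves only as a frozen embedder
            p.requires_grad = False
        self.lstm = nn.LSTM(self.bert.config.hidden_size, hidden,
                            batch_first=True, bidirectional=True)
        self.attn = nn.MultiheadAttention(2 * hidden, heads, batch_first=True)
        self.emit = nn.Linear(2 * hidden, num_tags)  # per-token tag scores
        self.crf = CRF(num_tags, batch_first=True)

    def forward(self, input_ids, attention_mask, tags=None):
        with torch.no_grad():              # no fine-tuning of BERT
            emb = self.bert(input_ids,
                            attention_mask=attention_mask).last_hidden_state
        h, _ = self.lstm(emb)
        # Self-attention over the BiLSTM states; padded positions are masked out.
        h, _ = self.attn(h, h, h, key_padding_mask=~attention_mask.bool())
        emissions = self.emit(h)
        mask = attention_mask.bool()
        if tags is not None:               # training: CRF negative log-likelihood
            return -self.crf(emissions, tags, mask=mask)
        return self.crf.decode(emissions, mask=mask)  # inference: best tag paths

A forward pass with gold tags returns the CRF negative log-likelihood to minimize; without tags it returns the highest-scoring tag sequence for each sentence.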


Related research

05/05/2023 · LLM-RM at SemEval-2023 Task 2: Multilingual Complex NER using XLM-RoBERTa
Named Entity Recognition (NER) is a task of recognizing entities at a tok...

09/17/2021 · The futility of STILTs for the classification of lexical borrowings in Spanish
The first edition of the IberLEF 2021 shared task on automatic detection...

06/04/2019 · Sequence Tagging with Contextual and Non-Contextual Subword Representations: A Multilingual Evaluation
Pretrained contextual and non-contextual subword embeddings have become ...

07/02/2020 · NLNDE: Enhancing Neural Sequence Taggers with Attention and Noisy Channel for Robust Pharmacological Entity Detection
Named entity recognition has been extensively studied on English news te...

05/12/2021 · Priberam Labs at the NTCIR-15 SHINRA2020-ML: Classification Task
Wikipedia is an online encyclopedia available in 285 languages. It compo...

05/31/2022 · hmBERT: Historical Multilingual Language Models for Named Entity Recognition
Compared to standard Named Entity Recognition (NER), identifying persons...

10/18/2016 · Vietnamese Named Entity Recognition using Token Regular Expressions and Bidirectional Inference
This paper describes an efficient approach to improve the accuracy of a ...
