Application of Pre-training Models in Named Entity Recognition

02/09/2020
by Yu Wang, et al.

Named Entity Recognition (NER) is a fundamental Natural Language Processing (NLP) task that extracts entities from unstructured data. Earlier methods for NER were based on classical machine learning or deep learning. Recently, pre-training models have significantly improved performance on multiple NLP tasks. In this paper, we first introduce the architectures and pre-training tasks of four common pre-training models: BERT, ERNIE, ERNIE2.0-tiny, and RoBERTa. We then apply these pre-training models to an NER task by fine-tuning, and compare the effects of the different model architectures and pre-training tasks on NER performance. The experimental results show that RoBERTa achieved state-of-the-art results on the MSRA-2006 dataset.
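Since the approach described is fine-tuning a pre-trained encoder with a token-classification head for NER, the following is a minimal sketch of that setup. It assumes the Hugging Face transformers library; the checkpoint name, label set, and training loop details are illustrative and not the authors' exact configuration.

```python
# Minimal sketch: fine-tuning a pre-trained model for NER as token
# classification. Checkpoint name and label set are illustrative only.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

labels = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC", "B-ORG", "I-ORG"]
model_name = "hfl/chinese-roberta-wwm-ext"  # hypothetical RoBERTa checkpoint

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(
    model_name, num_labels=len(labels)
)

# Encode a sentence and run one training step; during fine-tuning the
# token-level cross-entropy loss is backpropagated through the entire
# pre-trained encoder, not just the classification head.
text = "Barack Obama visited Beijing."
inputs = tokenizer(text, return_tensors="pt")
dummy_labels = torch.zeros(inputs["input_ids"].shape, dtype=torch.long)
outputs = model(**inputs, labels=dummy_labels)
outputs.loss.backward()  # optimizer step omitted for brevity
```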

