Self-Attention Gazetteer Embeddings for Named-Entity Recognition

04/08/2020
by Stanislav Peshterliev, et al.

Recent attempts to ingest external knowledge into neural models for named-entity recognition (NER) have exhibited mixed results. In this work, we present GazSelfAttn, a novel gazetteer embedding approach that uses self-attention and match span encoding to build enhanced gazetteer embeddings. In addition, we demonstrate how to build gazetteer resources from the open-source Wikidata knowledge base. Evaluations on the CoNLL-03 and OntoNotes 5 datasets show F1 improvements over the baseline model from 92.34 to 92.86 and from 89.11 to 89.32, respectively, achieving performance comparable to large state-of-the-art models.
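To make the idea concrete, below is a minimal sketch of a gazetteer embedding built with self-attention over match span encodings. It is not the authors' implementation: the entity types, the BIOES-style position labels, the embedding size, and the use of unprojected dot-product attention with mean pooling are all simplifying assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical gazetteer match features for one token: each match is an
# (entity_type, span_position) pair, e.g. ("PER", "B") when the token
# begins a multi-word gazetteer match for a person name.
types = ["PER", "LOC", "ORG"]
positions = ["B", "I", "E", "S"]  # BIOES-style match span encoding

d = 8  # embedding size (assumed for the sketch)
type_emb = {t: rng.normal(size=d) for t in types}
pos_emb = {p: rng.normal(size=d) for p in positions}

def gazetteer_embedding(matches):
    """Combine multiple gazetteer matches for one token via self-attention."""
    # Each match embedding = entity-type embedding + span-position embedding.
    X = np.stack([type_emb[t] + pos_emb[p] for t, p in matches])  # (m, d)
    # Single-head scaled dot-product self-attention. A trained model would
    # use learned query/key/value projections; they are omitted here.
    scores = X @ X.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    attended = weights @ X            # (m, d)
    return attended.mean(axis=0)      # pool the matches into one vector

# A token matching both a person gazetteer (span begin) and a
# single-token organization gazetteer entry:
emb = gazetteer_embedding([("PER", "B"), ("ORG", "S")])
print(emb.shape)  # (8,)
```

The resulting vector would be concatenated with the token's word and character embeddings before the NER tagger, letting the model weigh conflicting gazetteer matches instead of taking a hard union.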


