
Self-attention-based BiGRU and capsule network for named entity recognition

by Jianfeng Deng, et al.
Guangdong University of Technology; NetEase, Inc.

Named entity recognition (NER) is one of the core tasks of natural language processing (NLP). Traditional character representations have weak expressive power, and common neural network methods fail to capture important sequence information. To address these problems, a self-attention-based bidirectional gated recurrent unit (BiGRU) and capsule network (CapsNet) model for NER is proposed. The model generates character vectors with the pre-trained bidirectional encoder representations from transformers (BERT) model. A BiGRU captures sequential context features, and a self-attention mechanism assigns different weights to the information captured by the BiGRU's hidden layers. Finally, CapsNet is used for entity recognition. We evaluated the recognition performance of the model on two datasets. Experimental results show that the model achieves better performance without relying on external dictionary information.
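The BiGRU-plus-self-attention stage described above can be sketched in PyTorch. This is a minimal illustration, not the paper's implementation: the BERT embedding layer and the CapsNet classifier are omitted, and all layer sizes, head counts, and the use of `nn.MultiheadAttention` for the self-attention step are assumptions for demonstration.

```python
import torch
import torch.nn as nn

class BiGRUSelfAttention(nn.Module):
    """Sketch of the BiGRU + self-attention encoder from the abstract.
    Input is assumed to be BERT character vectors; the CapsNet head
    that would follow for entity classification is omitted."""

    def __init__(self, emb_dim=768, hidden=128, num_heads=4):
        super().__init__()
        # Bidirectional GRU captures left and right sequence context
        # for every character position.
        self.bigru = nn.GRU(emb_dim, hidden,
                            batch_first=True, bidirectional=True)
        # Self-attention reweights the BiGRU hidden states so the model
        # can focus differently on each captured feature.
        self.attn = nn.MultiheadAttention(2 * hidden, num_heads,
                                          batch_first=True)

    def forward(self, x):
        h, _ = self.bigru(x)               # (batch, seq, 2*hidden)
        out, weights = self.attn(h, h, h)  # attend over all positions
        return out, weights

# Toy input standing in for BERT character vectors: batch of 2, length 10.
model = BiGRUSelfAttention()
x = torch.randn(2, 10, 768)
out, weights = model(x)
print(out.shape)      # (2, 10, 256): attended context features
print(weights.shape)  # (2, 10, 10): per-position attention weights
```

In the full model, the attended features `out` would feed a capsule network whose dynamic routing replaces the usual softmax-over-labels classifier.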



