Enhancing Neural Sequence Labeling with Position-Aware Self-Attention

08/24/2019
by   Wei Wei, et al.

Sequence labeling is a fundamental task in natural language processing and has been widely studied. Recently, RNN-based sequence labeling models have gained increasing attention. Despite the superior performance achieved by learning long short-term (i.e., successive) dependencies, processing inputs sequentially may limit the ability to capture non-contiguous relations between tokens within a sentence. To tackle this problem, we focus on effectively modeling both the successive and the discrete dependencies of each token to enhance sequence labeling performance. Specifically, we propose an innovative and well-designed attention-based model, called position-aware self-attention (PSA), within a neural network architecture, which exploits the positional information of an input sequence to capture the latent relations among tokens. Extensive experiments on three classical sequence labeling tasks, i.e., part-of-speech (POS) tagging, named entity recognition (NER), and phrase chunking, demonstrate that our proposed model outperforms the state of the art on various metrics without any external knowledge.
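The paper does not include code, but the core idea of the abstract, self-attention whose scores depend on token positions as well as content, can be illustrated with a minimal NumPy sketch. The function name, shapes, and the use of sinusoidal absolute position encodings are illustrative assumptions, not the authors' actual PSA formulation:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def position_aware_self_attention(X, P, Wq, Wk, Wv):
    """Self-attention over token representations X (seq_len, d_model).
    Position embeddings P are added to the inputs, so attention scores
    depend on where tokens sit in the sentence, letting each token
    attend to non-contiguous (discrete) as well as nearby tokens.
    (Hypothetical sketch; the paper's PSA model is more elaborate.)"""
    H = X + P                                  # inject positional information
    Q, K, V = H @ Wq, H @ Wk, H @ Wv           # query/key/value projections
    scores = Q @ K.T / np.sqrt(Q.shape[-1])    # scaled dot-product scores
    return softmax(scores) @ V                 # position-aware weighted sum

# Toy example: a 4-token sentence with model dimension 8.
rng = np.random.default_rng(0)
seq_len, d = 4, 8
X = rng.normal(size=(seq_len, d))

# Sinusoidal absolute position encodings (one common choice).
pos = np.arange(seq_len)[:, None]
i = np.arange(d)[None, :]
angles = pos / np.power(10000.0, (2 * (i // 2)) / d)
P = np.where(i % 2 == 0, np.sin(angles), np.cos(angles))

Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
out = position_aware_self_attention(X, P, Wq, Wk, Wv)
print(out.shape)  # one contextual vector per token
```

In a full tagger, the per-token outputs would feed a label classifier (e.g., a softmax or CRF layer) over the POS, NER, or chunking tag set.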


Related research

01/30/2020 - Self-attention-based BiGRU and capsule network for named entity recognition
  Named entity recognition (NER) is one of the tasks of natural language pr...

11/13/2020 - A Survey on Recent Advances in Sequence Labeling from Deep Learning Models
  Sequence labeling (SL) is a fundamental research problem encompassing a ...

01/31/2018 - Reinforced Self-Attention Network: a Hybrid of Hard and Soft Attention for Sequence Modeling
  Many natural language processing tasks solely rely on sparse dependencie...

12/05/2017 - Deep Semantic Role Labeling with Self-Attention
  Semantic Role Labeling (SRL) is believed to be a crucial step towards na...

02/17/2023 - Uncertainty-aware Self-training for Low-resource Neural Sequence Labeling
  Neural sequence labeling (NSL) aims at assigning labels for input langua...

08/02/2017 - Low-Rank Hidden State Embeddings for Viterbi Sequence Labeling
  In textual information extraction and other sequence labeling tasks it i...

07/19/2023 - Integrating a Heterogeneous Graph with Entity-aware Self-attention using Relative Position Labels for Reading Comprehension Model
  Despite the significant progress made by transformer models in machine r...
