A Global Context Mechanism for Sequence Labeling

05/31/2023
by Conglei Xu, et al.

Sequential labeling tasks necessitate the computation of sentence representations for each word within a given sentence. With the advent of advanced pretrained language models, one common approach involves incorporating a BiLSTM layer to bolster the sequence structure information at the output level. Nevertheless, it has been empirically demonstrated (P.-H. Li et al., 2020) that the potential of BiLSTM for generating sentence representations for sequence labeling tasks is constrained, primarily due to the amalgamation of fragments from past and future sentence representations to form a complete sentence representation. In this study, we discovered that strategically integrating the whole-sentence representation, which exists in the first cell and last cell of BiLSTM, into the sentence representation of each cell could markedly enhance the F1 score and accuracy. Using BERT embedded within BiLSTM as an illustration, we conducted exhaustive experiments on nine datasets for sequence labeling tasks, encompassing named entity recognition (NER), part-of-speech (POS) tagging, and End-to-End Aspect-Based Sentiment Analysis (E2E-ABSA). We noted significant improvements in F1 scores and accuracy across all examined datasets.
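The core idea above can be sketched in PyTorch: a bidirectional LSTM's forward state at the last position and backward state at the first position each summarize the whole sentence, and that global vector is fused into every per-token representation before classification. This is a minimal sketch, not the authors' exact architecture; the class name, dimensions, and the concatenation-based fusion are assumptions for illustration.

```python
import torch
import torch.nn as nn

class GlobalContextBiLSTM(nn.Module):
    """Sketch of a BiLSTM tagging head that fuses a global sentence
    vector into every token position (assumed fusion: concatenation)."""

    def __init__(self, input_dim, hidden_dim, num_tags):
        super().__init__()
        self.bilstm = nn.LSTM(input_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # per-token output (2H) + global context (2H) -> tag logits
        self.classifier = nn.Linear(4 * hidden_dim, num_tags)

    def forward(self, x):
        # x: (batch, seq_len, input_dim), e.g. BERT embeddings
        outputs, _ = self.bilstm(x)           # (batch, seq_len, 2H)
        hidden = outputs.size(-1) // 2
        # forward LSTM state at the last token, backward state at the first:
        # each has "seen" the entire sentence in its direction.
        fwd_last = outputs[:, -1, :hidden]
        bwd_first = outputs[:, 0, hidden:]
        global_ctx = torch.cat([fwd_last, bwd_first], dim=-1)   # (batch, 2H)
        # broadcast the global vector to every position and concatenate
        global_ctx = global_ctx.unsqueeze(1).expand(-1, outputs.size(1), -1)
        fused = torch.cat([outputs, global_ctx], dim=-1)        # (batch, L, 4H)
        return self.classifier(fused)         # (batch, seq_len, num_tags)

model = GlobalContextBiLSTM(input_dim=768, hidden_dim=256, num_tags=9)
logits = model(torch.randn(2, 5, 768))       # logits.shape == (2, 5, 9)
```

Gating or summation are equally plausible fusion choices; concatenation is used here only because it keeps the sketch short and shape-transparent.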


