Hierarchical Contextualized Representation for Named Entity Recognition

11/06/2019
by Ying Luo, et al.

Current named entity recognition (NER) models are typically based on a bidirectional LSTM (BiLSTM) architecture. Their sequential nature and the modeling of each sentence in isolation prevent the full utilization of global information, both across the entire sentence and across the entire document (dataset). In this paper, we address these two deficiencies and propose a model augmented with a hierarchical contextualized representation: a sentence-level representation and a document-level representation. In sentence-level modeling, we take the different contributions of words within a sentence into account and enhance the sentence representation learned by an independent BiLSTM via a label embedding attention mechanism. Furthermore, a key-value memory network is adopted to record global information for each unique word and to generate the document-level representation, which is sensitive to the similarity of context information. The two levels of the hierarchical contextualized representation are fused with each input token embedding and with the corresponding BiLSTM hidden state, respectively. Experimental results on three benchmark NER datasets (the CoNLL-2003 and OntoNotes 5.0 English datasets and the CoNLL-2002 Spanish dataset) show that our model establishes new state-of-the-art results on these benchmarks.
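To make the two components concrete, below is a minimal PyTorch sketch of (a) a sentence-level representation built by attending over per-token BiLSTM states with label embeddings, and (b) a key-value memory that retrieves a context-sensitive document-level vector for each unique word. Module names, tensor shapes, and the retrieval/fallback details are illustrative assumptions, not the authors' exact implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LabelAttentionSentenceRep(nn.Module):
    """Sentence-level representation (sketch): score each token's BiLSTM state
    against learned label embeddings and pool tokens by those scores."""

    def __init__(self, hidden_dim, num_labels):
        super().__init__()
        self.label_emb = nn.Embedding(num_labels, hidden_dim)  # one vector per NER label

    def forward(self, hidden_states):                 # (batch, seq_len, hidden_dim)
        labels = self.label_emb.weight                # (num_labels, hidden_dim)
        # similarity between every token state and every label embedding
        scores = torch.einsum('bth,lh->btl', hidden_states, labels)
        # a token's weight is its best label match, normalized over the sentence
        token_weights = scores.max(dim=-1).values.softmax(dim=-1)   # (batch, seq_len)
        # weighted sum of token states -> one sentence vector per sentence
        return torch.einsum('bt,bth->bh', token_weights, hidden_states)


class KeyValueMemoryDocRep(nn.Module):
    """Document-level representation (sketch): a key-value memory per unique word;
    reading retrieves values of stored contexts weighted by key similarity."""

    def __init__(self):
        super().__init__()
        self.memory = {}                              # word id -> list of (key, value) vectors

    def write(self, word_ids, keys, values):
        # record the context (key) and state (value) observed for each word occurrence
        for w, k, v in zip(word_ids.tolist(), keys, values):
            self.memory.setdefault(w, []).append((k.detach(), v.detach()))

    def read(self, word_id, query):                   # query: (hidden_dim,)
        slots = self.memory.get(word_id)
        if not slots:                                 # unseen word: fall back to the query itself
            return query
        keys = torch.stack([k for k, _ in slots])     # (n_slots, hidden_dim)
        vals = torch.stack([v for _, v in slots])
        attn = F.softmax(keys @ query, dim=0)         # similarity-weighted retrieval
        return attn @ vals


# Fusion (assumed form): the sentence vector is concatenated with each input token
# embedding, and the retrieved document-level vector with each BiLSTM hidden state,
# before the final tagging layer.
```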


Related research

06/02/2021
Exploiting Global Contextual Information for Document-level Named Entity Recognition
Most existing named entity recognition (NER) approaches are based on seq...

06/06/2019
GCDT: A Global Context Enhanced Deep Transition Architecture for Sequence Labeling
Current state-of-the-art systems for sequence labeling are typically bas...

05/08/2021
Improving Named Entity Recognition by External Context Retrieving and Cooperative Learning
Recent advances in Named Entity Recognition (NER) show that document-lev...

03/23/2020
E2EET: From Pipeline to End-to-end Entity Typing via Transformer-Based Embeddings
Entity Typing (ET) is the process of identifying the semantic types of e...

05/18/2022
A reproducible experimental survey on biomedical sentence similarity: a string-based method sets the state of the art
This registered report introduces the largest, and for the first time, r...

05/04/2023
The Role of Global and Local Context in Named Entity Recognition
Pre-trained transformer-based models have recently shown great performan...

05/31/2023
A Global Context Mechanism for Sequence Labeling
Sequential labeling tasks necessitate the computation of sentence repres...
