Exploring Cross-sentence Contexts for Named Entity Recognition with BERT

06/02/2020
by Jouni Luoma, et al.

Named entity recognition (NER) is frequently addressed as a sequence classification task where each input consists of one sentence of text. It is nevertheless clear that useful information for the task can often be found outside the scope of a single-sentence context. Recently proposed self-attention models such as BERT can both efficiently capture long-distance relationships in the input and represent inputs consisting of several sentences, creating new opportunities for approaches that incorporate cross-sentence information in natural language processing tasks. In this paper, we present a systematic study exploring the use of cross-sentence information for NER using BERT models in five languages. We find that adding context in the form of additional sentences to BERT input systematically increases NER performance on all of the tested languages and models. Including multiple sentences in each input also allows us to study the predictions of the same sentences in different contexts. We propose a straightforward method, Contextual Majority Voting (CMV), to combine different predictions for sentences and demonstrate that it further increases NER performance with BERT. Our approach does not require any changes to the underlying BERT architecture, relying instead on restructuring examples for training and prediction. Evaluation on established datasets, including the CoNLL'02 and CoNLL'03 NER benchmarks, demonstrates that our proposed approach can improve on the state-of-the-art NER results on English, Dutch, and Finnish, achieves the best reported BERT-based results on German, and is on par with performance reported with other BERT-based approaches in Spanish. We release all methods implemented in this work under open licenses.
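The core of the Contextual Majority Voting idea described above can be illustrated with a small sketch: when the same sentence appears in several multi-sentence inputs, each occurrence yields one label sequence for its tokens, and the final labels are chosen by per-token majority vote. The function below is a simplified, hypothetical illustration (the function name and the toy predictions are ours, not from the paper, and it omits the windowing and tokenization machinery a real BERT pipeline would need):

```python
from collections import Counter

def contextual_majority_voting(predictions_per_context):
    """Combine per-token NER label predictions made for the same sentence
    when it appeared in several different multi-sentence inputs.

    predictions_per_context: list of label sequences, one per context,
    all of equal length (one label per token of the sentence).
    Returns a single label sequence chosen by per-token majority vote.
    """
    n_tokens = len(predictions_per_context[0])
    assert all(len(p) == n_tokens for p in predictions_per_context)
    combined = []
    for i in range(n_tokens):
        votes = Counter(p[i] for p in predictions_per_context)
        combined.append(votes.most_common(1)[0][0])
    return combined

# The same three-token sentence predicted in three different contexts:
ctx_preds = [
    ["B-PER", "O", "B-LOC"],
    ["B-PER", "O", "O"],
    ["B-PER", "O", "B-LOC"],
]
print(contextual_majority_voting(ctx_preds))  # ['B-PER', 'O', 'B-LOC']
```

In this toy example, the third token is labeled `B-LOC` in two of the three contexts, so the vote recovers `B-LOC` even though one context predicted `O`.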
