Context-aware Adversarial Training for Name Regularity Bias in Named Entity Recognition

07/24/2021
by   Abbas Ghaddar, et al.

In this work, we examine the ability of NER models to exploit contextual information when predicting the type of an ambiguous entity. We introduce NRB, a new testbed carefully designed to diagnose the name regularity bias of NER models. Our results indicate that all the state-of-the-art models we tested exhibit this bias, with fine-tuned BERT models significantly outperforming feature-based (LSTM-CRF) ones on NRB despite comparable (sometimes lower) performance on standard benchmarks. To mitigate this bias, we propose a novel model-agnostic training method that adds learnable adversarial noise to some entity mentions, thereby forcing models to rely more strongly on the contextual signal and yielding significant gains on NRB. Combining it with two other training strategies, data augmentation and parameter freezing, leads to further gains.
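The core idea of the proposed method is to perturb only the embeddings at entity-mention positions, so the context tokens remain the model's reliable signal. The sketch below is not the authors' implementation; it is a minimal PyTorch illustration, with hypothetical shapes and a placeholder loss, of how learnable noise can be restricted to masked entity positions so that gradients (and hence adversarial updates) touch only those tokens.

```python
import torch

torch.manual_seed(0)

# Hypothetical batch of contextual token embeddings: (batch, seq_len, dim)
B, T, D = 2, 8, 16
embeddings = torch.randn(B, T, D)

# Mask marking entity-mention positions (1.0 = entity token, 0.0 = context)
entity_mask = torch.zeros(B, T)
entity_mask[0, 2:4] = 1.0
entity_mask[1, 5] = 1.0

# Learnable adversarial noise, one vector per token position
noise = torch.zeros_like(embeddings, requires_grad=True)

# Apply noise only at entity positions; context embeddings stay intact
perturbed = embeddings + noise * entity_mask.unsqueeze(-1)

# Placeholder loss; the real method would instead run the NER model on
# `perturbed` and update `noise` adversarially (gradient ascent on the loss)
loss = perturbed.sum()
loss.backward()

# Gradients reach the noise only where the entity mask is set
assert noise.grad[0, 2].abs().sum() > 0   # entity position: perturbed
assert noise.grad[0, 0].abs().sum() == 0  # context position: untouched
```

In an actual training loop, the noise would be optimized to maximize the tagger's loss while the model is optimized to minimize it, so that name-internal cues become unreliable and the model must lean on context.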


Related research

- 11/30/2021 — KARL-Trans-NER: Knowledge Aware Representation Learning for Named Entity Recognition using Transformers
- 12/15/2019 — Robust Named Entity Recognition with Truecasing Pretraining
- 06/23/2017 — Named Entity Recognition with stack residual LSTM and trainable bias decoding
- 01/11/2017 — Generalisation in Named Entity Recognition: A Quantitative Analysis
- 04/12/2022 — Delving Deep into Regularity: A Simple but Effective Method for Chinese Named Entity Recognition
- 04/25/2020 — A Rigourous Study on Named Entity Recognition: Can Fine-tuning Pretrained Model Lead to the Promised Land?
