Context-aware Adversarial Attack on Named Entity Recognition

09/16/2023
by Shuguang Chen, et al.

In recent years, large pre-trained language models (PLMs) have achieved remarkable performance on many natural language processing benchmarks. Despite their success, prior studies have shown that PLMs are vulnerable to attacks from adversarial examples. In this work, we focus on the named entity recognition task and study context-aware adversarial attack methods to examine the model's robustness. Specifically, we propose perturbing the most informative words for recognizing entities to create adversarial examples and investigate different candidate replacement methods to generate natural and plausible adversarial examples. Experiments and analyses show that our methods are more effective in deceiving the model into making wrong predictions than strong baselines.
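Below is a minimal sketch of the general idea described in the abstract: score context words by how much they contribute to recognizing an entity, then replace the most informative ones with context-appropriate candidates. Everything concrete here is an assumption for illustration, not the paper's method: `entity_confidence` is a toy stand-in for querying a real NER model, importance is estimated by leave-one-out deletion, and the candidate lists are hand-written where the paper investigates learned, context-aware replacement methods (e.g., a masked language model).

```python
# Hedged sketch of a context-aware adversarial attack on NER.
# Assumptions (not from the paper): `entity_confidence` is a toy heuristic
# standing in for a victim NER model; importance is leave-one-out; candidate
# replacements are a small hand-written dictionary instead of a masked LM.

from typing import Callable, Dict, List, Tuple


def entity_confidence(tokens: List[str], entity_span: Tuple[int, int]) -> float:
    """Toy stand-in for an NER model: confidence that the span is an entity.

    Rewards informative context cues near the span; a real attack would
    query the victim model here instead.
    """
    start, end = entity_span
    context = [t.lower() for i, t in enumerate(tokens) if not start <= i < end]
    cues = {"visited", "in", "president", "company"}
    return 0.5 + 0.1 * sum(1 for t in context if t in cues)


def word_importance(tokens: List[str], entity_span: Tuple[int, int],
                    score: Callable) -> List[Tuple[int, float]]:
    """Rank context words by the leave-one-out drop in model confidence."""
    base = score(tokens, entity_span)
    ranked = []
    for i in range(len(tokens)):
        if entity_span[0] <= i < entity_span[1]:
            continue  # never perturb the entity mention itself
        reduced = tokens[:i] + tokens[i + 1:]
        shift = 1 if i < entity_span[0] else 0  # re-index span if needed
        span = (entity_span[0] - shift, entity_span[1] - shift)
        ranked.append((i, base - score(reduced, span)))
    return sorted(ranked, key=lambda x: -x[1])


def attack(tokens: List[str], entity_span: Tuple[int, int],
           candidates: Dict[str, List[str]], k: int = 1) -> List[str]:
    """Replace the k most informative context words to lower entity confidence."""
    adv = list(tokens)
    for idx, _ in word_importance(tokens, entity_span, entity_confidence)[:k]:
        best_word, best_score = adv[idx], entity_confidence(adv, entity_span)
        for cand in candidates.get(adv[idx].lower(), []):
            trial = adv[:idx] + [cand] + adv[idx + 1:]
            s = entity_confidence(trial, entity_span)
            if s < best_score:
                best_word, best_score = cand, s
        adv[idx] = best_word
    return adv


if __name__ == "__main__":
    sent = ["Obama", "visited", "Paris", "in", "July"]
    # hypothetical candidates; a masked LM would propose these in context
    cands = {"visited": ["saw", "liked"], "in": ["during", "last"]}
    print(attack(sent, entity_span=(2, 3), candidates=cands, k=2))
```

Running the example perturbs the two context words the toy model relies on most ("visited", "in"), yielding a fluent sentence that lowers the stand-in confidence for the entity span while leaving the entity mention itself untouched.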


