longhorns at DADC 2022: How many linguists does it take to fool a Question Answering model? A systematic approach to adversarial attacks

06/29/2022
by Venelin Kovatchev, et al.

Developing methods to adversarially challenge NLP systems is a promising avenue for improving both model performance and interpretability. Here, we describe the approach of the team "longhorns" on Task 1 of the First Workshop on Dynamic Adversarial Data Collection (DADC), which asked teams to manually fool a model on an Extractive Question Answering task. Our team finished first, with a model error rate of 62%. We advocate for a systematic, linguistically informed approach to formulating adversarial questions, and we describe the results of our pilot experiments, as well as our official submission.
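As context for what "fooling" an extractive QA model amounts to in practice, here is a minimal sketch of how one might probe such a model with a candidate adversarial question and check whether its predicted answer span matches the gold answer. The checkpoint (`deepset/roberta-base-squad2`), the example data, and the helper names are illustrative assumptions, not the setup used by the team described above.

```python
# Minimal sketch: probing an extractive QA model with a candidate question
# and checking whether the model is "fooled" (its span misses the gold answer).
# Model checkpoint and helpers are illustrative, not the team's actual setup.
from transformers import pipeline

qa_model = pipeline("question-answering", model="deepset/roberta-base-squad2")

def normalize(text: str) -> str:
    """Lowercase and strip surrounding whitespace/punctuation for a loose match."""
    return text.lower().strip(" .,;:'\"")

def model_is_fooled(context: str, question: str, gold_answer: str) -> bool:
    """Return True if the model's predicted span does not match the gold answer."""
    prediction = qa_model(question=question, context=context)
    return normalize(prediction["answer"]) != normalize(gold_answer)

# Toy example drawn from the abstract above.
context = (
    'The team "longhorns" finished first in Task 1 of the DADC shared task, '
    "with a model error rate of 62%."
)
question = "What was the model error rate achieved against the team longhorns?"
gold_answer = "62%"

print("Model fooled:", model_is_fooled(context, question, gold_answer))
```

Aggregating such per-question checks over a set of validated adversarial questions is what yields a model error rate of the kind reported in the abstract.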
