Did the Model Understand the Question?

We analyze state-of-the-art deep learning models for three tasks: question answering on (1) images, (2) tables, and (3) passages of text. Using the notion of attribution (word importance), we find that these deep networks often ignore important question terms. Leveraging such behavior, we perturb questions to craft a variety of adversarial examples. Our strongest attacks drop the accuracy of a visual question answering model from 61.1% to 19%, and that of a tabular question answering model from 33.5% to 3.3%. Additionally, we show how attributions can strengthen attacks proposed by Jia and Liang (2017) on paragraph comprehension models. Our results demonstrate that attributions can augment standard measures of accuracy and empower investigation of model performance. When a model is accurate but for the wrong reasons, attributions can surface erroneous logic in the model that indicates inadequacies in the test data.
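The attributions the abstract refers to are computed with Integrated Gradients, which assigns each question word an importance score by accumulating gradients along a path from a baseline (e.g. an all-zero embedding) to the actual input. A minimal sketch on a toy linear scorer, where the path integral is exact; the scorer, weights, and baseline here are illustrative, not the paper's actual models:

```python
import numpy as np

# Toy "question scorer": a linear score over per-word feature vectors.
# Real models in the paper are deep networks; this stands in for one.
rng = np.random.default_rng(0)
W = rng.normal(size=4)  # one weight per feature dimension (hypothetical)

def score(x):
    # x: (num_words, 4) word features; the score sums word contributions
    return float((x @ W).sum())

def grad_score(x):
    # gradient of the score w.r.t. x is W for every word (model is linear)
    return np.broadcast_to(W, x.shape)

def integrated_gradients(x, baseline, steps=50):
    # Average gradients along the straight-line path from baseline to x,
    # then scale by (x - baseline). Attributions satisfy "completeness":
    # they sum to score(x) - score(baseline).
    alphas = np.linspace(0.0, 1.0, steps + 1).reshape(-1, 1, 1)
    path = baseline + alphas * (x - baseline)
    grads = np.stack([grad_score(p) for p in path])
    return (x - baseline) * grads.mean(axis=0)

question = rng.normal(size=(3, 4))   # features for 3 question words
baseline = np.zeros_like(question)   # all-zero baseline, a common choice
attr = integrated_gradients(question, baseline)
per_word = attr.sum(axis=1)          # importance score of each word
# Completeness check: attributions account for the full score gap
assert np.isclose(per_word.sum(), score(question) - score(baseline))
```

Words with near-zero attribution are the "ignored" question terms the paper exploits: perturbing or replacing them should not change the model's answer, which is the basis of the adversarial attacks described above.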

Related research:
- Dynamic Memory Networks for Visual and Textual Question Answering (03/04/2016)
- Are You Tough Enough? Framework for Robustness Validation of Machine Comprehension Systems (12/05/2018)
- The Meaning of "Most" for Visual Question Answering Models (12/31/2018)
- TSQA: Tabular Scenario Based Question Answering (01/14/2021)
- DCN+: Mixed Objective and Deep Residual Coattention for Question Answering (10/31/2017)
- U-CAM: Visual Explanation using Uncertainty based Class Activation Maps (08/17/2019)
