DisentQA: Disentangling Parametric and Contextual Knowledge with Counterfactual Question Answering

11/10/2022
by   Ella Neeman, et al.

Question answering models commonly have access to two sources of "knowledge" at inference time: (1) parametric knowledge, the factual knowledge encoded in the model weights, and (2) contextual knowledge, external knowledge (e.g., a Wikipedia passage) given to the model to generate a grounded answer. Entangling these two sources is a core issue for generative QA models, as it is unclear whether an answer stems from the given non-parametric knowledge or not. This lack of clarity has implications for trust, interpretability, and factuality. In this work, we propose a new paradigm in which QA models are trained to disentangle the two sources of knowledge. Using counterfactual data augmentation, we introduce a model that predicts two answers for a given question: one based on the given contextual knowledge and one based on parametric knowledge. Our experiments on the Natural Questions dataset show that this approach improves the performance of QA models, making them more robust to knowledge conflicts between the two knowledge sources while generating useful disentangled answers.
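The counterfactual data augmentation described above can be sketched as follows. This is a minimal illustration, assuming a simple entity-substitution scheme and a text target that names both answers explicitly; the function and field names are hypothetical, not the paper's actual implementation.

```python
# Minimal sketch of counterfactual augmentation for disentangled QA.
# Assumption: we create a knowledge conflict by replacing the gold answer
# span in the context with a substitute entity, and train the model to
# output BOTH a contextual and a parametric answer.

def make_counterfactual(question: str, context: str,
                        answer: str, substitute: str) -> dict:
    """Swap the gold answer in the context for a substitute entity,
    producing a training example with two disentangled answer slots."""
    cf_context = context.replace(answer, substitute)
    # Contextual answer: what the (edited) passage now supports.
    # Parametric answer: what the model should "remember" from pre-training.
    target = f"contextual: {substitute}, parametric: {answer}"
    return {"question": question, "context": cf_context, "target": target}

example = make_counterfactual(
    question="Where is the Eiffel Tower located?",
    context="The Eiffel Tower is a landmark located in Paris.",
    answer="Paris",
    substitute="Rome",
)
```

A model trained on such examples is pushed to ground the contextual answer in the passage even when the passage contradicts its memory, which is the robustness-to-knowledge-conflicts behavior the abstract describes.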


Related research:

- 09/06/2019: Incorporating External Knowledge into Machine Reading for Generative Question Answering
  "Commonsense and background knowledge is required for a QA model to answe..."
- 10/25/2022: Rich Knowledge Sources Bring Complex Knowledge Conflicts: Recalibrating Models to Reflect Conflicting Evidence
  "Question answering models can use rich knowledge sources – up to one hun..."
- 05/23/2022: StreamingQA: A Benchmark for Adaptation to New Knowledge over Time in Question Answering Models
  "Knowledge and language understanding of models evaluated through questio..."
- 09/10/2021: Entity-Based Knowledge Conflicts in Question Answering
  "Knowledge-dependent tasks typically use two sources of knowledge: parame..."
- 04/11/2023: chatIPCC: Grounding Conversational AI in Climate Science
  "Large Language Models (LLMs) have made significant progress in recent ye..."
- 09/20/2023: Localize, Retrieve and Fuse: A Generalized Framework for Free-Form Question Answering over Tables
  "Question answering on tabular data (a.k.a TableQA), which aims at genera..."
- 09/08/2017: Globally Normalized Reader
  "Rapid progress has been made towards question answering (QA) systems tha..."
