BERT-CoQAC: BERT-based Conversational Question Answering in Context

04/23/2021
by Munazza Zaib, et al.

As a promising way to inquire about particular information through a dialog with a bot, question answering dialog systems have gained increasing research interest recently. Designing interactive QA systems has long been a challenging task in natural language processing and serves as a benchmark for evaluating a machine's natural language understanding. However, such systems often struggle when question answering is carried out over multiple turns, with users seeking more information based on what they have already learned; this gives rise to a more complicated task called Conversational Question Answering (CQA). CQA systems are often criticized for failing to understand or utilize the previous conversational context when answering questions. To address this research gap, in this paper we explore how to integrate conversational history into a neural machine comprehension system. On one hand, we introduce a framework based on the publicly available pre-trained language model BERT for incorporating history turns into the system. On the other hand, we propose a history selection mechanism that selects the turns that are relevant and contribute the most to answering the current question. Experimental results show that our framework is comparable in performance to the state-of-the-art models on the QuAC leaderboard. We also conduct a number of experiments to show the side effects of using the entire context, which introduces unnecessary information and noise signals, resulting in a decline in the model's performance.
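To make the idea concrete, the sketch below illustrates (and is not the authors' implementation) how a history selection step might pick the most relevant past turns and prepend them to the current question in a BERT-style input sequence. The overlap-based `score_turn` heuristic, the `top_k` cutoff, and all function names are illustrative assumptions; the paper's actual selection mechanism is learned.

```python
# Illustrative sketch, NOT the BERT-CoQAC implementation: selecting
# history turns and forming a BERT-style [CLS]/[SEP] input sequence.

def score_turn(turn: str, question: str) -> float:
    """Toy relevance score: fraction of a history turn's words that also
    appear in the current question. Stands in for the paper's learned
    history selection mechanism."""
    q_words = set(question.lower().split())
    t_words = set(turn.lower().split())
    return len(q_words & t_words) / max(len(t_words), 1)

def build_input(question: str, history: list[str], passage: str,
                top_k: int = 2) -> str:
    """Keep only the top_k most relevant history turns, then build a
    BERT-style input: [CLS] selected-history question [SEP] passage [SEP]."""
    ranked = sorted(history, key=lambda t: score_turn(t, question),
                    reverse=True)
    selected = ranked[:top_k]
    return ("[CLS] " + " ".join(selected + [question])
            + " [SEP] " + passage + " [SEP]")

# Hypothetical conversation for illustration only.
history = ["Who wrote Hamlet?", "Shakespeare.",
           "When was it first performed?"]
seq = build_input("Where was he born?", history,
                  "William Shakespeare was born in Stratford-upon-Avon.")
print(seq)
```

Feeding only the selected turns, rather than the entire conversation, reflects the abstract's observation that using the full context can inject noise and hurt performance.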


