A Context-aware Attention Network for Interactive Question Answering

12/22/2016
by Huayu Li, et al.

Neural network based sequence-to-sequence models in an encoder-decoder framework have been successfully applied to Question Answering (QA), predicting answers from statements and questions. However, almost all previous models fail to consider detailed context information and unknown states, under which the system does not have enough information to answer a given question. These scenarios with incomplete or ambiguous information are very common in the setting of Interactive Question Answering (IQA). To address this challenge, we develop a novel model that employs context-dependent word-level attention for more accurate statement representations and question-guided sentence-level attention for better context modeling. We also build new IQA datasets, which will be made publicly available, to evaluate our model. Employing these attention mechanisms, our model accurately understands when it can output an answer and when it instead needs to generate a supplementary question to request additional input, depending on the context. When available, the user's feedback is encoded and directly applied to update the sentence-level attention and infer an answer. Extensive experiments on QA and IQA datasets quantitatively demonstrate the effectiveness of our model, with significant improvement over state-of-the-art conventional QA models.
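
The central mechanism described above is question-guided sentence-level attention over encoded statements. The sketch below shows, in PyTorch, how such an attention layer might be structured; the class name `SentenceAttention`, the bilinear scoring function, and all dimensions are illustrative assumptions rather than the authors' released implementation.

```python
# Minimal sketch (not the authors' code) of question-guided sentence-level
# attention: each encoded sentence is scored against the question vector,
# and the scores are normalized into attention weights that produce a
# question-aware context representation.

import torch
import torch.nn as nn


class SentenceAttention(nn.Module):
    """Attend over sentence representations, guided by an encoded question."""

    def __init__(self, hidden_size: int):
        super().__init__()
        # Bilinear scoring between the question and each sentence
        # (an assumption; the paper may use a different scoring function).
        self.score = nn.Bilinear(hidden_size, hidden_size, 1)

    def forward(self, sentences: torch.Tensor, question: torch.Tensor):
        # sentences: (batch, num_sentences, hidden)
        # question:  (batch, hidden)
        num_sentences = sentences.size(1)
        q = question.unsqueeze(1).expand(-1, num_sentences, -1).contiguous()
        scores = self.score(q, sentences).squeeze(-1)        # (batch, num_sentences)
        weights = torch.softmax(scores, dim=1)               # sentence-level attention
        context = torch.bmm(weights.unsqueeze(1), sentences).squeeze(1)
        return context, weights


if __name__ == "__main__":
    attn = SentenceAttention(hidden_size=64)
    sents = torch.randn(2, 5, 64)   # 2 stories, 5 encoded sentences each
    quest = torch.randn(2, 64)      # 2 encoded questions
    ctx, w = attn(sents, quest)
    print(ctx.shape, w.shape)       # torch.Size([2, 64]) torch.Size([2, 5])
```

In an IQA setting, the same attention weights could be recomputed after encoding the user's feedback, so that the supplementary input directly reshapes which sentences the model relies on when inferring the answer.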

Related research

07/20/2018 · Question-Aware Sentence Gating Networks for Question and Answering
Machine comprehension question answering, which finds an answer to the q...

11/16/2017 · An Abstractive approach to Question Answering
Question Answering has come a long way from answer sentence selection, r...

04/04/2016 · Character-Level Question Answering with Attention
We show that a character-level encoder-decoder framework can be successf...

07/20/2016 · Neural Contextual Conversation Learning with Labeled Question-Answering Pairs
Neural conversational models tend to produce generic or safe responses i...

01/25/2018 · A Question-Focused Multi-Factor Attention Network for Question Answering
Neural network models recently proposed for question answering (QA) prim...

01/12/2019 · Semi-interactive Attention Network for Answer Understanding in Reverse-QA
Question answering (QA) is an important natural language processing (NLP...

11/25/2019 · Conclusion-Supplement Answer Generation for Non-Factoid Questions
This paper tackles the goal of conclusion-supplement answer generation f...
