Neural Contextual Conversation Learning with Labeled Question-Answering Pairs

07/20/2016
by   Kun Xiong, et al.

Neural conversational models tend to produce generic or safe responses in different contexts, e.g., replying "Of course" to narrative statements or "I don't know" to questions. In this paper, we propose an end-to-end approach to avoiding this problem in neural generative models. Additional memory mechanisms are introduced into standard sequence-to-sequence (seq2seq) models so that context can be taken into account while generating sentences. Three seq2seq models, which memorize a fixed-size contextual vector from the hidden input, from the hidden input/output, and through a gated contextual attention structure respectively, have been trained and tested on a dataset of labeled question-answering pairs in Chinese. The model with contextual attention outperforms the others, including state-of-the-art seq2seq models, on perplexity tests. The novel contextual model generates diverse and robust responses, and is able to carry out conversations on a wide range of topics appropriately.
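The gated contextual attention described above can be sketched roughly as follows: the decoder state attends over a set of memorized context states to form a fixed-size contextual vector, and a learned sigmoid gate decides how much of that context to blend into the next decoding step. This is a minimal NumPy illustration of the general mechanism, not the authors' implementation; all names (`W_a`, `W_g`, `b_g`) and shapes are assumptions for the sketch.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def gated_contextual_attention(h_t, context_states, W_a, W_g, b_g):
    """Blend decoder state h_t (shape (d,)) with an attention-weighted
    contextual vector drawn from context_states (shape (n, d))."""
    # Attention scores: bilinear match between h_t and each context state.
    scores = context_states @ (W_a @ h_t)          # (n,)
    weights = softmax(scores)                      # attention distribution
    c_t = weights @ context_states                 # fixed-size contextual vector (d,)
    # Sigmoid gate controls how much context flows into the output state.
    g = 1.0 / (1.0 + np.exp(-(W_g @ np.concatenate([h_t, c_t]) + b_g)))
    return g * h_t + (1.0 - g) * c_t, weights
```

In a full seq2seq decoder this blended state would replace the plain hidden state when predicting the next token, which is one way a model can steer away from context-independent "safe" replies.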

