Context-Aware Sequence-to-Sequence Models for Conversational Systems

05/22/2018
by Silje Christensen, et al.

This work proposes a novel approach based on sequence-to-sequence (seq2seq) models for context-aware conversational systems. Existing seq2seq models have been shown to generate natural responses in data-driven conversational systems, but they still lack mechanisms for incorporating previous conversation turns. We investigate RNN-based methods that efficiently integrate previous turns as context for generating responses. Overall, our experimental results based on human judgment demonstrate the feasibility and effectiveness of the proposed approach.
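The abstract does not specify the exact context-integration mechanism, so the sketch below illustrates one common strategy for making an RNN-based seq2seq model context-aware: previous turns are concatenated with the current utterance and fed to the encoder, whose final hidden state initialises the decoder. All names here (ContextSeq2Seq, SEP_ID) are illustrative and not taken from the paper; this is a minimal PyTorch sketch, not the authors' implementation.

```python
import torch
import torch.nn as nn

SEP_ID = 1  # hypothetical separator token placed between conversation turns


class ContextSeq2Seq(nn.Module):
    """GRU encoder-decoder that conditions on previous turns by
    concatenating them with the current utterance along the time axis."""

    def __init__(self, vocab_size, emb_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, context_turns, current_turn, response_in):
        # Build the encoder input: [previous turns ; SEP ; current utterance],
        # so the encoder summarises the whole dialogue history.
        batch = current_turn.size(0)
        sep = torch.full((batch, 1), SEP_ID, dtype=torch.long,
                         device=current_turn.device)
        src = torch.cat([context_turns, sep, current_turn], dim=1)

        # Encoder's final hidden state initialises the decoder (teacher forcing).
        _, h = self.encoder(self.embed(src))           # h: (1, batch, hidden)
        dec_out, _ = self.decoder(self.embed(response_in), h)
        return self.out(dec_out)                       # logits over vocabulary


# Toy usage: a batch of 2 dialogues with random token ids.
model = ContextSeq2Seq(vocab_size=1000)
ctx = torch.randint(2, 1000, (2, 12))    # flattened previous turns
cur = torch.randint(2, 1000, (2, 6))     # current user utterance
resp = torch.randint(2, 1000, (2, 8))    # gold response fed to the decoder
logits = model(ctx, cur, resp)
print(logits.shape)  # torch.Size([2, 8, 1000])
```

Other integration strategies discussed in the literature, such as hierarchical encoders that summarise each turn separately, would replace the simple concatenation step above but keep the same encoder-decoder skeleton.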


