NEXUS Network: Connecting the Preceding and the Following in Dialogue Generation

09/27/2018
by Hui Su et al.

Sequence-to-Sequence (seq2seq) models have become overwhelmingly popular for building end-to-end trainable dialogue systems. Though highly efficient at learning the backbone of human-computer communication, they suffer from a strong bias toward short, generic responses. In this paper, we argue that a good response should smoothly connect both the preceding dialogue history and the following conversation. We strengthen this connection through mutual information maximization. To sidestep the non-differentiability of discrete natural language tokens, we introduce an auxiliary continuous code space and map it to a learnable prior distribution for generation purposes. Experiments on two dialogue datasets validate the effectiveness of our model: the generated responses are closely related to the dialogue context and lead to more interactive conversations.
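Mutual-information objectives of this kind are typically trained through a variational lower bound rather than computed directly. The sketch below illustrates the standard Barber–Agakov bound, I(X;Z) >= H(Z) + E[log q(z|x)], on a toy correlated Gaussian where the bound can be checked against the closed-form MI. This is an illustration of the general technique only, not the paper's actual architecture: the Gaussian pair stands in for the dialogue side and the continuous code, and the exact posterior stands in for what would be a learned neural predictor.

```python
import numpy as np

rng = np.random.default_rng(0)
rho = 0.8          # correlation between observation x and latent code z
n = 200_000

# Toy stand-ins: x plays the observed dialogue side, z the continuous code.
x = rng.standard_normal(n)
z = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# Barber–Agakov bound: I(X;Z) >= H(Z) + E[log q(z|x)].
# Here q(z|x) = N(rho*x, 1 - rho^2) is the exact posterior, so the bound
# is tight; with a learned q the bound is maximized during training.
h_z = 0.5 * np.log(2 * np.pi * np.e)                       # entropy of N(0, 1)
log_q = (-0.5 * np.log(2 * np.pi * (1 - rho**2))
         - (z - rho * x)**2 / (2 * (1 - rho**2)))
mi_bound = h_z + log_q.mean()

# Closed-form MI of a bivariate Gaussian for comparison.
mi_true = -0.5 * np.log(1 - rho**2)
print(f"variational bound {mi_bound:.3f}  vs  true MI {mi_true:.3f}")
```

Because maximizing E[log q(z|x)] tightens this bound, an auxiliary "predict the code from the context" likelihood term acts as a differentiable surrogate for raising the mutual information between the code and the surrounding dialogue.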

Related research:

- 04/03/2023 · Dialog-to-Actions: Building Task-Oriented Dialogue System via Action-Level Generation
  End-to-end generation-based approaches have been investigated and applie...
- 05/31/2021 · Learning from Perturbations: Diverse and Informative Dialogue Generation with Inverse Adversarial Training
  In this paper, we propose Inverse Adversarial Training (IAT) algorithm f...
- 08/25/2016 · A Context-aware Natural Language Generator for Dialogue Systems
  We present a novel natural language generation system for spoken dialogu...
- 08/24/2020 · End to End Dialogue Transformer
  Dialogue systems attempt to facilitate conversations between humans and ...
- 06/02/2016 · Multiresolution Recurrent Neural Networks: An Application to Dialogue Response Generation
  We introduce the multiresolution recurrent neural network, which extends...
- 04/06/2020 · Data Manipulation: Towards Effective Instance Learning for Neural Dialogue Generation via Learning to Augment and Reweight
  Current state-of-the-art neural dialogue models learn from human convers...
- 08/26/2021 · Just Say No: Analyzing the Stance of Neural Dialogue Generation in Offensive Contexts
  Dialogue models trained on human conversations inadvertently learn to ge...
