Generating Dialogue Responses from a Semantic Latent Space

10/04/2020
by Wei-Jen Ko, et al.

Existing open-domain dialogue generation models are typically trained to mimic the gold response in the training set using a cross-entropy loss over the vocabulary. However, a good response need not resemble the gold response, since there are multiple valid responses to a given prompt. In this work, we hypothesize that current models are unable to integrate information from the multiple semantically similar valid responses to a prompt, and therefore generate generic and uninformative responses. To address this issue, we propose an alternative to end-to-end classification over the vocabulary: we instead learn the relationship between prompts and responses as a regression task in a latent space. In our novel dialogue generation model, the representations of semantically related sentences lie close to each other in this latent space. Human evaluation shows that learning the task in a continuous space produces responses that are both relevant and informative.
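The following is a minimal sketch (in PyTorch) of the core idea as described in the abstract: rather than token-level cross-entropy over the vocabulary, a prompt encoder is trained with a regression loss that pulls its prediction toward the gold response's vector in a semantic latent space, where similar responses lie near each other. All module names, dimensions, and the choice of cosine-based loss are illustrative assumptions, not the authors' actual implementation.

import torch
import torch.nn as nn

LATENT_DIM = 512  # assumed size of the semantic latent space


class PromptToLatent(nn.Module):
    """Maps a prompt embedding to a predicted point in the response latent space."""

    def __init__(self, input_dim: int = 768, latent_dim: int = LATENT_DIM):
        super().__init__()
        self.proj = nn.Sequential(
            nn.Linear(input_dim, latent_dim),
            nn.Tanh(),
            nn.Linear(latent_dim, latent_dim),
        )

    def forward(self, prompt_emb: torch.Tensor) -> torch.Tensor:
        return self.proj(prompt_emb)


def regression_loss(predicted: torch.Tensor, gold_latent: torch.Tensor) -> torch.Tensor:
    """Regression objective: pull the prediction toward the gold response's latent vector.

    Because semantically similar responses cluster in this space, the target is a
    smooth region rather than a single token sequence.
    """
    return 1.0 - nn.functional.cosine_similarity(predicted, gold_latent, dim=-1).mean()


# Hypothetical usage: prompt_emb and gold_latent would come from a pretrained
# sentence encoder applied to the prompt and the gold response, respectively.
model = PromptToLatent()
prompt_emb = torch.randn(8, 768)           # batch of prompt embeddings (placeholder)
gold_latent = torch.randn(8, LATENT_DIM)   # latent vectors of gold responses (placeholder)
loss = regression_loss(model(prompt_emb), gold_latent)
loss.backward()

At inference time, a decoder conditioned on the predicted latent vector (or a nearest-neighbor lookup over response embeddings) would turn the point in latent space back into a response; the sketch above covers only the regression training objective.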


Related research

Understanding and Improving the Exemplar-based Generation for Open-domain Conversation (12/13/2021)
CORAL: Contextual Response Retrievability Loss Function for Training Dialog Generation Models (05/21/2022)
Evaluating Open-Domain Dialogues in Latent Space with Next Sentence Prediction and Mutual Information (05/26/2023)
A Brief Study on the Effects of Training Generative Dialogue Models with a Semantic loss (06/20/2021)
Cue-word Driven Neural Response Generation with a Shrinking Vocabulary (10/10/2020)
Aging Memories Generate More Fluent Dialogue Responses with Memory Networks (11/19/2019)
