Context Matters in Semantically Controlled Language Generation for Task-oriented Dialogue Systems

11/28/2021
by Ye Liu, et al.

This work combines information about the dialogue history, encoded by a pre-trained model, with a meaning representation of the current system utterance to realize contextual language generation in task-oriented dialogues. We utilize the pre-trained multi-context ConveRT model for context representation in a model trained from scratch, and leverage the immediately preceding user utterance for context generation in a model adapted from the pre-trained GPT-2. Experiments on the MultiWOZ dataset show that contextual information encoded by a pre-trained model improves response generation in both automatic metrics and human evaluation. Our contextual generator produces a greater variety of responses that fit better into the ongoing dialogue. An analysis of context size shows that a longer context does not automatically lead to better performance, but that the immediately preceding user utterance plays an essential role in contextual generation. In addition, we propose a re-ranker for the GPT-based generation model; experiments show that the response selected by the re-ranker yields a significant improvement on automatic metrics.
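The GPT-2-based setup described above conditions generation on the immediately preceding user utterance together with the meaning representation (dialogue acts) of the system turn, and then selects among sampled candidates with a re-ranker. A minimal sketch of this pipeline is below; the separator tokens, the serialization format, and the given candidate scores are all illustrative assumptions, not the paper's exact implementation.

```python
# Hypothetical sketch of (1) serializing a training/inference example for a
# GPT-2-style contextual generator and (2) a toy re-ranking step.
# Marker tokens like <|user|> and <|mr|> are assumptions for illustration.

def build_gpt2_input(prev_user_utterance, dialogue_acts, response=None):
    """Flatten one example: preceding user utterance + meaning representation.

    dialogue_acts: list of (act, slot, value) triples, e.g.
    ("Inform", "area", "north").
    """
    mr = " , ".join(f"{act}({slot}={value})" for act, slot, value in dialogue_acts)
    parts = [f"<|user|> {prev_user_utterance}", f"<|mr|> {mr}", "<|system|>"]
    if response is not None:  # at training time, append the gold response
        parts.append(response)
    return " ".join(parts)

def rerank(candidates, scores):
    """Toy re-ranker: return the candidate with the highest score.

    The paper's re-ranker learns these scores; here they are given.
    """
    best, _ = max(zip(candidates, scores), key=lambda cs: cs[1])
    return best
```

At inference time one would decode several candidates from the fine-tuned model for the serialized input and pass them, with their scores, to `rerank`:

```python
prompt = build_gpt2_input(
    "I need a cheap hotel in the north.",
    [("Inform", "area", "north"), ("Inform", "price", "cheap")],
)
# prompt now ends with "<|system|>", ready for the model to continue.
```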


