Controllable Generation of Dialogue Acts for Dialogue Systems via Few-Shot Response Generation and Ranking

07/26/2023
by Angela Ramirez, et al.

Dialogue systems need to produce responses that realize multiple types of dialogue acts (DAs) with high semantic fidelity. In the past, natural language generators (NLGs) for dialogue were trained on large parallel corpora that map from a domain-specific DA and its semantic attributes to an output utterance. Recent work shows that large pretrained language models (LLMs) offer new possibilities for controllable NLG using prompt-based learning. Here we develop a novel few-shot overgenerate-and-rank approach that achieves the controlled generation of DAs. We compare eight few-shot prompt styles that include a novel method of generating from textual pseudo-references using a textual style transfer approach. We develop six automatic ranking functions that identify outputs with both the correct DA and high semantic accuracy at generation time. We test our approach on three domains and four LLMs. To our knowledge, this is the first work on NLG for dialogue that automatically ranks outputs using both DA and attribute accuracy. For completeness, we compare our results to fine-tuned few-shot models trained with 5 to 100 instances per DA. Our results show that several prompt settings achieve perfect DA accuracy and near-perfect semantic accuracy (99.81%).
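
The abstract describes an overgenerate-and-rank pipeline: sample several candidate responses from a few-shot prompted LLM, then rank them by whether they realize the target dialogue act and how much of the target semantic content they cover. The Python sketch below is only an illustration of that loop under assumed interfaces; the generate, classify_da, and attr_coverage callables are hypothetical placeholders for the paper's prompting and ranking functions, not its released code.

from dataclasses import dataclass
from typing import Callable, List, Sequence


@dataclass
class Candidate:
    text: str
    da_correct: bool   # did the DA classifier recover the target dialogue act?
    attr_score: float  # fraction of target semantic attributes realized in the text


def overgenerate_and_rank(
    generate: Callable[[], str],            # e.g. one few-shot LLM prompt call (placeholder)
    classify_da: Callable[[str], str],      # predicts the dialogue act of a response (placeholder)
    attr_coverage: Callable[[str, Sequence[str]], float],  # scores attribute realization (placeholder)
    target_da: str,
    attributes: Sequence[str],
    n_candidates: int = 10,
) -> str:
    """Sample several candidates, then pick the one whose predicted dialogue
    act matches the target and whose attribute coverage is highest."""
    candidates: List[Candidate] = []
    for _ in range(n_candidates):
        text = generate()
        candidates.append(
            Candidate(
                text=text,
                da_correct=(classify_da(text) == target_da),
                attr_score=attr_coverage(text, attributes),
            )
        )
    # Rank: correct DA first, then higher semantic (attribute) accuracy.
    best = max(candidates, key=lambda c: (c.da_correct, c.attr_score))
    return best.text

A caller would plug in, for instance, a temperature-sampled LLM completion for generate and a trained DA classifier for classify_da; the actual ranking functions and prompt styles compared in the paper are described in the full text.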

