DiSTRICT: Dialogue State Tracking with Retriever Driven In-Context Tuning

12/06/2022
by Praveen Venkateswaran, et al.

Dialogue State Tracking (DST), a key component of task-oriented conversation systems, represents user intentions by determining the values of pre-defined slots in an ongoing dialogue. Existing approaches use hand-crafted templates and additional slot information to fine-tune and prompt large pre-trained language models to elicit slot values from the dialogue context. Significant manual effort and domain knowledge are required to design effective prompts, limiting the generalizability of these approaches to new domains and tasks. In this work, we propose DiSTRICT, a generalizable in-context tuning approach for DST that retrieves highly relevant training examples for a given dialogue to fine-tune the model without any hand-crafted templates. Experiments with the MultiWOZ benchmark datasets show that DiSTRICT outperforms existing approaches in various zero-shot and few-shot settings using a much smaller model, thereby providing an important advantage for real-world deployments that often have limited resource availability.
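The retrieval step described in the abstract can be illustrated with a minimal sketch. The example below is hypothetical and not the paper's implementation: it stands in for DiSTRICT's learned retriever with a simple bag-of-words cosine similarity, retrieves the training examples most similar to an incoming dialogue, and concatenates them as in-context demonstrations for a slot, with no hand-crafted template. All function names, the toy training pool, and the prompt layout are illustrative assumptions.

```python
from collections import Counter
import math

def embed(text):
    # Toy bag-of-words vector; DiSTRICT itself uses a trained dense retriever.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, pool, k=2):
    # Rank training examples by similarity to the current dialogue.
    q = embed(query)
    ranked = sorted(pool, key=lambda ex: cosine(q, embed(ex["dialogue"])),
                    reverse=True)
    return ranked[:k]

def build_input(dialogue, slot, examples):
    # Concatenate retrieved examples as in-context demonstrations,
    # ending with the target dialogue whose slot value is to be predicted.
    parts = [f"Dialogue: {ex['dialogue']}\n{slot} = {ex['value']}"
             for ex in examples]
    parts.append(f"Dialogue: {dialogue}\n{slot} =")
    return "\n\n".join(parts)

# Hypothetical training pool with gold values for the "price range" slot.
train_pool = [
    {"dialogue": "i need a cheap hotel in the north", "value": "cheap"},
    {"dialogue": "book a taxi to the airport at 5 pm", "value": "none"},
    {"dialogue": "find me an expensive restaurant downtown", "value": "expensive"},
]

query = "looking for a cheap restaurant in the centre"
examples = retrieve(query, train_pool, k=2)
prompt = build_input(query, "price range", examples)
```

In the full approach, inputs assembled this way are used both to fine-tune a (comparatively small) sequence-to-sequence model and at inference time, which is what removes the need for manually designed prompts.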

Related research

- Leveraging Slot Descriptions for Zero-Shot Cross-Domain Dialogue State Tracking (05/10/2021): Zero-shot cross-domain dialogue state tracking (DST) enables us to handl...
- AutoReply: Detecting Nonsense in Dialogue Introspectively with Discriminative Replies (11/22/2022): Existing approaches built separate classifiers to detect nonsense in dia...
- Choice Fusion as Knowledge for Zero-Shot Dialogue State Tracking (02/25/2023): With the demanding need for deploying dialogue systems in new domains wi...
- Towards Universal Dialogue State Tracking (10/22/2018): Dialogue state tracking is the core part of a spoken dialogue system. It...
- Parameter-Efficient Low-Resource Dialogue State Tracking by Prompt Tuning (01/26/2023): Dialogue state tracking (DST) is an important step in dialogue managemen...
- Prototypical Calibration for Few-shot Learning of Language Models (05/20/2022): In-context learning of GPT-like models has been recognized as fragile ac...
- Towards a Universal NLG for Dialogue Systems and Simulators with Future Bridging (05/21/2021): In a dialogue system pipeline, a natural language generation (NLG) unit ...
