MinTL: Minimalist Transfer Learning for Task-Oriented Dialogue Systems

by Zhaojiang Lin, et al.

In this paper, we propose Minimalist Transfer Learning (MinTL) to simplify the system design process of task-oriented dialogue systems and alleviate the over-dependency on annotated data. MinTL is a simple yet effective transfer learning framework, which allows us to plug-and-play pre-trained seq2seq models, and jointly learn dialogue state tracking and dialogue response generation. Unlike previous approaches, which use a copy mechanism to "carryover" the old dialogue states to the new one, we introduce Levenshtein belief spans (Lev), which allow efficient dialogue state tracking with a minimal generation length. We instantiate our learning framework with two pre-trained backbones: T5 and BART, and evaluate them on MultiWOZ. Extensive experiments demonstrate that: 1) our systems establish new state-of-the-art results on end-to-end response generation, 2) MinTL-based systems are more robust than baseline methods in the low resource setting, and they achieve competitive results with only 20% training data, and 3) Lev greatly improves the inference efficiency.
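The core idea behind Levenshtein belief spans is that the model generates only the minimal edit between the previous turn's belief state and the current one, rather than regenerating the full state. The sketch below illustrates this edit-and-apply scheme; the dict representation, slot names, and the `<null>` delete marker are illustrative assumptions, not the paper's exact serialization format.

```python
# Illustrative sketch of Levenshtein belief spans (Lev): the model emits
# only the difference between consecutive belief states, keeping the
# generation length minimal. Slot names and the NULL marker are
# hypothetical; the paper's actual span format may differ.

NULL = "<null>"  # assumed marker for a deleted slot value


def compute_lev(prev: dict, curr: dict) -> dict:
    """Minimal edit that turns `prev` into `curr`."""
    # Slots whose values changed or were newly added.
    lev = {slot: v for slot, v in curr.items() if prev.get(slot) != v}
    # Slots removed since the last turn are marked for deletion.
    lev.update({slot: NULL for slot in prev if slot not in curr})
    return lev


def apply_lev(prev: dict, lev: dict) -> dict:
    """Reconstruct the full belief state from the previous state and an edit."""
    curr = dict(prev)
    for slot, value in lev.items():
        if value == NULL:
            curr.pop(slot, None)
        else:
            curr[slot] = value
    return curr
```

For example, if only one slot changes between turns, the edit contains a single slot-value pair instead of the whole state, which is what makes decoding at inference time cheaper.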




Multi-Task Pre-Training for Plug-and-Play Task-Oriented Dialogue System

Pre-trained language models have been recently shown to benefit task-ori...

A Simple Language Model for Task-Oriented Dialogue

Task-oriented dialogue is often decomposed into three tasks: understandi...

A Simple But Effective Approach to n-shot Task-Oriented Dialogue Augmentation

The collection and annotation of task-oriented conversational data is a ...

Few-Shot Dialogue Generation Without Annotated Data: A Transfer Learning Approach

Learning with minimal data is one of the key challenges in the developme...

Data-Efficient Methods for Dialogue Systems

Conversational User Interface (CUI) has become ubiquitous in everyday li...

Prompt Learning for Few-Shot Dialogue State Tracking

Collecting dialogue state labels, slots and values, for learning dialogu...

Efficient Task-Oriented Dialogue Systems with Response Selection as an Auxiliary Task

The adoption of pre-trained language models in task-oriented dialogue sy...

Code Repositories


MinTL: Minimalist Transfer Learning for Task-Oriented Dialogue Systems
