Data-Efficient Methods for Dialogue Systems

12/05/2020
by   Igor Shalyminov, et al.

Conversational User Interfaces (CUIs) have become ubiquitous in everyday life, in consumer-focused products like Siri and Alexa as well as in business-oriented solutions. Deep learning underlies many recent breakthroughs in dialogue systems, but it requires very large amounts of training data, often annotated by experts. When trained on smaller datasets, these methods severely lack robustness (e.g. to disfluencies and out-of-domain input) and often generalise poorly. In this thesis, we address these issues by introducing a series of methods for training robust dialogue systems from minimal data. Firstly, we study two orthogonal approaches to dialogue, linguistically informed and machine learning-based, from the data-efficiency perspective, and outline the steps for obtaining data-efficient solutions with either approach. We then introduce two data-efficient models for dialogue response generation: the Dialogue Knowledge Transfer Network, based on latent-variable dialogue representations, and the hybrid Generative-Retrieval Transformer model (ranked first in the DSTC 8 Fast Domain Adaptation task). Next, we address the problem of robustness given minimal data: we propose a multitask LSTM-based model for domain-general disfluency detection, and, for the problem of out-of-domain input, we present Turn Dropout, a data augmentation technique for anomaly detection that uses only in-domain data, together with autoencoder-augmented models for efficient training with Turn Dropout. Finally, we focus on social dialogue and introduce a neural model for response ranking in social conversation, used in Alana, the third-place winner of the Amazon Alexa Prize 2017 and 2018. We employ a novel technique of predicting dialogue length as the main ranking objective and show that this approach improves on its ratings-based counterpart in data efficiency while matching it in performance.
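To illustrate the Turn Dropout idea in the abstract, here is a minimal sketch of one way such an augmentation could work. It assumes (the thesis itself may differ in detail) that randomly selected turns in an in-domain dialogue are replaced with distractor utterances and labelled as anomalous, yielding out-of-domain training examples without any real out-of-domain data. The function name `turn_dropout` and the pool-based replacement strategy are illustrative, not taken from the thesis.

```python
import random

def turn_dropout(dialogue, distractor_pool, p=0.3, rng=None):
    """Augment an in-domain dialogue for anomaly detection.

    Each turn is, with probability p, replaced by a random utterance
    drawn from `distractor_pool` and marked anomalous (label 1);
    untouched turns keep the in-domain label 0.
    (Illustrative sketch; the actual Turn Dropout procedure may differ.)
    """
    rng = rng or random.Random()
    augmented = []
    for turn in dialogue:
        if rng.random() < p:
            # Simulated out-of-domain turn: substitute a distractor.
            augmented.append((rng.choice(distractor_pool), 1))
        else:
            # Original in-domain turn, kept as-is.
            augmented.append((turn, 0))
    return augmented
```

An anomaly detector can then be trained on the resulting (turn, label) pairs using in-domain dialogues alone.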


Related research

Hybrid Generative-Retrieval Transformers for Dialogue Domain Adaptation (03/03/2020)
Domain adaptation has recently become a key problem in dialogue systems ...

Data-Efficient Goal-Oriented Conversation with Dialogue Knowledge Transfer Networks (10/03/2019)
Goal-oriented dialogue systems are now being widely adopted in industry ...

MinTL: Minimalist Transfer Learning for Task-Oriented Dialogue Systems (09/25/2020)
In this paper, we propose Minimalist Transfer Learning (MinTL) to simpli...

DialAug: Mixing up Dialogue Contexts in Contrastive Learning for Robust Conversational Modeling (04/15/2022)
Retrieval-based conversational systems learn to rank response candidates...

Neural Response Ranking for Social Conversation: A Data-Efficient Approach (11/02/2018)
The overall objective of 'social' dialogue systems is to support engagin...

An Ensemble Model with Ranking for Social Dialogue (12/20/2017)
Open-domain social dialogue is one of the long-standing goals of Artific...

Improving Robustness of Neural Dialog Systems in a Data-Efficient Way with Turn Dropout (11/29/2018)
Neural network-based dialog models often lack robustness to anomalous, o...
