Pretraining Methods for Dialog Context Representation Learning

06/02/2019
by Shikib Mehri et al.

This paper examines various unsupervised pretraining objectives for learning dialog context representations. Two novel methods for pretraining dialog context encoders are proposed, and a total of four methods are examined. Each pretraining objective is fine-tuned and evaluated on a set of downstream dialog tasks using the MultiWOZ dataset, and strong performance improvements are observed. Further evaluation shows that our pretraining objectives result in not only better performance, but also better convergence, models that are less data-hungry, and better domain generalizability.
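The abstract does not spell out the four objectives, so the sketch below illustrates the general recipe with next-utterance retrieval, a standard unsupervised objective for dialog context encoders: encode a dialog context, then score the true next utterance against in-batch negatives. All names (ContextEncoder, retrieval_loss) are hypothetical, and the architecture is a placeholder, not the paper's model.

```python
# A minimal sketch of unsupervised dialog-context pretraining, assuming a
# next-utterance-retrieval objective. Module names here are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ContextEncoder(nn.Module):
    """Encodes a token-id sequence (dialog context or utterance) into a vector."""
    def __init__(self, vocab_size=10_000, dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.gru = nn.GRU(dim, dim, batch_first=True)

    def forward(self, token_ids):               # (batch, seq_len)
        _, h = self.gru(self.embed(token_ids))  # h: (1, batch, dim)
        return h.squeeze(0)                     # (batch, dim)

def retrieval_loss(ctx_enc, utt_enc, contexts, next_utts):
    """Next-utterance retrieval: score each context against every utterance
    in the batch; the matching (diagonal) pair is the positive."""
    c = ctx_enc(contexts)             # (batch, dim)
    u = utt_enc(next_utts)            # (batch, dim)
    scores = c @ u.t()                # (batch, batch) similarity matrix
    labels = torch.arange(c.size(0))  # diagonal entries are the true pairs
    return F.cross_entropy(scores, labels)

# Toy pretraining step on random token ids, just to show the wiring.
ctx_enc, utt_enc = ContextEncoder(), ContextEncoder()
opt = torch.optim.Adam(
    list(ctx_enc.parameters()) + list(utt_enc.parameters()), lr=1e-3)
contexts = torch.randint(0, 10_000, (8, 50))   # 8 dialog contexts, 50 tokens
next_utts = torch.randint(0, 10_000, (8, 12))  # their true next utterances
loss = retrieval_loss(ctx_enc, utt_enc, contexts, next_utts)
opt.zero_grad(); loss.backward(); opt.step()
# After pretraining, ctx_enc would be fine-tuned on downstream dialog tasks,
# e.g. the MultiWOZ tasks the paper evaluates on.
```

The design choice worth noting is that pretraining and fine-tuning share the same context encoder; only the task-specific head changes downstream, which is what makes the learned context representation transferable.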


