MulZDG: Multilingual Code-Switching Framework for Zero-shot Dialogue Generation

08/18/2022
by Yongkang Liu, et al.

Building dialogue generation systems in a zero-shot scenario remains a huge challenge, since typical zero-shot approaches to dialogue generation rely heavily on large-scale pre-trained language generation models such as GPT-3 and T5. Research on zero-shot dialogue generation without such cumbersome language models is limited by the lack of corresponding parallel dialogue corpora. In this paper, we propose a simple but effective Multilingual learning framework for Zero-shot Dialogue Generation (dubbed MulZDG) that can effectively transfer knowledge from an English corpus with large-scale training samples to a non-English corpus with zero samples. In addition, MulZDG can be viewed as a multilingual data augmentation method that also improves performance on the resource-rich language. First, we construct multilingual code-switching dialogue datasets by translating utterances randomly selected from monolingual English datasets. Then we use MulZDG to train a unified multilingual dialogue model on the code-switching datasets; the framework performs implicit semantic alignment between the different languages. Experiments on the DailyDialog and DSTC7 datasets demonstrate that MulZDG not only achieves competitive zero-shot performance compared to training with sufficient examples, but also greatly improves performance on the source language.
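The dataset-construction step lends itself to a short sketch. The snippet below is a minimal illustration, not the paper's released code: it assumes a `translate` callable (any MT system or bilingual dictionary lookup would do) and a hypothetical `switch_rate` parameter controlling how often an English utterance is replaced by its translation; the paper's exact selection and translation setup may differ.

```python
import random
from typing import Callable, List

def build_code_switched_dialogue(
    dialogue: List[str],
    translate: Callable[[str], str],
    switch_rate: float = 0.5,
    seed: int = 0,
) -> List[str]:
    """Randomly replace utterances in an English dialogue with their
    translations, yielding one code-switched training sample.

    `switch_rate` is a hypothetical knob: each utterance is translated
    independently with this probability.
    """
    rng = random.Random(seed)
    return [
        translate(utt) if rng.random() < switch_rate else utt
        for utt in dialogue
    ]

# Example usage with a toy English->German dictionary standing in for
# a real translation system.
toy_dict = {"Hello!": "Hallo!", "How are you?": "Wie geht es dir?"}
dialogue = ["Hello!", "How are you?"]
mixed = build_code_switched_dialogue(
    dialogue, lambda u: toy_dict.get(u, u), switch_rate=0.9
)
# e.g. ['Hallo!', 'Wie geht es dir?'] when both random draws fall
# below switch_rate; the output varies with the seed.
print(mixed)
```

A unified multilingual model trained on such mixed dialogues sees source and target-language utterances in shared contexts, which is what allows the implicit semantic alignment described above without any parallel target-language dialogue data.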


Related research

08/27/2022 · MDIA: A Benchmark for Multilingual Dialogue Generation in 46 Languages
Owing to the lack of corpora for low-resource languages, current works o...

08/02/2022 · Multilingual Coreference Resolution in Multiparty Dialogue
Existing multiparty dialogue datasets for coreference resolution are nas...

12/10/2020 · Multilingual Transfer Learning for QA Using Translation as Data Augmentation
Prior work on multilingual question answering has mostly focused on usin...

03/19/2022 · Learning-by-Narrating: Narrative Pre-Training for Zero-Shot Dialogue Comprehension
Comprehending a dialogue requires a model to capture diverse kinds of ke...

05/23/2023 · Multilingual Large Language Models Are Not (Yet) Code-Switchers
Multilingual Large Language Models (LLMs) have recently shown great capa...

06/02/2023 · ChatGPT for Zero-shot Dialogue State Tracking: A Solution or an Opportunity?
Recent research on dialogue state tracking (DST) focuses on methods that...

12/31/2020 · A Closer Look at Few-Shot Crosslingual Transfer: Variance, Benchmarks and Baselines
We present a focused study of few-shot crosslingual transfer, a recently...
