Attention-Informed Mixed-Language Training for Zero-shot Cross-lingual Task-oriented Dialogue Systems

11/21/2019
by Zihan Liu, et al.

Recently, data-driven task-oriented dialogue systems have achieved promising performance in English. However, developing dialogue systems that support low-resource languages remains a long-standing challenge due to the absence of high-quality data. To circumvent expensive and time-consuming data collection, we introduce Attention-Informed Mixed-Language Training (MLT), a novel zero-shot adaptation method for cross-lingual task-oriented dialogue systems. It leverages very few task-related parallel word pairs to generate code-switching sentences for learning the inter-lingual semantics across languages. Instead of manually selecting the word pairs, we propose to extract source words based on the scores computed by the attention layer of a trained English task-related model, and then to generate word pairs using existing bilingual dictionaries. Furthermore, extensive experiments with different cross-lingual embeddings demonstrate the effectiveness of our approach. Finally, with very few word pairs, our model achieves significant zero-shot adaptation performance improvements in both cross-lingual dialogue state tracking and natural language understanding (i.e., intent detection and slot filling) tasks compared to the current state-of-the-art approaches, which utilize a much larger amount of bilingual data.
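As a concrete illustration of the method sketched above, the snippet below shows one plausible form of the attention-informed code-switching step: the most-attended tokens of an English training sentence are swapped for their translations from a bilingual dictionary. This is a minimal sketch under assumptions, not the authors' released code; the function name code_switch, the attention scores, and the toy English-Spanish dictionary entries are all illustrative placeholders.

# Minimal sketch of attention-informed code-switching data generation.
# Assumes per-token attention scores are already extracted from a trained
# English task model; all names below are illustrative, not the paper's API.
from typing import Dict, List

def code_switch(tokens: List[str],
                attention_scores: List[float],
                bilingual_dict: Dict[str, str],
                top_k: int = 2) -> List[str]:
    """Replace the top_k most-attended words that have a dictionary
    translation, yielding a mixed-language (code-switched) sentence."""
    # Rank token positions by attention score, highest first.
    ranked = sorted(range(len(tokens)),
                    key=lambda i: attention_scores[i],
                    reverse=True)
    switched = list(tokens)
    replaced = 0
    for i in ranked:
        if replaced >= top_k:
            break
        translation = bilingual_dict.get(tokens[i].lower())
        if translation is not None:  # only swap words the dictionary covers
            switched[i] = translation
            replaced += 1
    return switched

# Toy usage with hypothetical attention scores and dictionary entries.
tokens = ["book", "a", "restaurant", "for", "two"]
scores = [0.45, 0.02, 0.40, 0.03, 0.10]  # e.g., from the model's attention layer
bi_dict = {"book": "reservar", "restaurant": "restaurante"}
print(code_switch(tokens, scores, bi_dict))
# -> ['reservar', 'a', 'restaurante', 'for', 'two']

Sentences generated this way serve as additional training data, so the model sees task-relevant words in both languages and learns aligned inter-lingual representations, which is what enables zero-shot transfer from only a handful of word pairs.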

Related research:

- Zero-shot Cross-lingual Dialogue Systems with Transferable Latent Variables (11/11/2019). Despite the surging demands for multilingual task-oriented dialog system...
- Zero and Few-Shot Localization of Task-Oriented Dialogue Agents with a Distilled Representation (02/18/2023). Task-oriented Dialogue (ToD) agents are mostly limited to a few widely-s...
- LaDA: Latent Dialogue Action For Zero-shot Cross-lingual Neural Network Language Modeling (08/05/2023). Cross-lingual adaptation has proven effective in spoken language underst...
- CrossAligner & Co: Zero-Shot Transfer Methods for Task-Oriented Cross-lingual Natural Language Understanding (03/18/2022). Task-oriented personal assistants enable people to interact with a host ...
- XQA-DST: Multi-Domain and Multi-Lingual Dialogue State Tracking (04/12/2022). In a task-oriented dialogue system, Dialogue State Tracking (DST) keeps ...
- Efficient Dialogue State Tracking by Masked Hierarchical Transformer (06/28/2021). This paper describes our approach to DSTC 9 Track 2: Cross-lingual Multi...
- Multi-level Contrastive Learning for Cross-lingual Spoken Language Understanding (05/07/2022). Although spoken language understanding (SLU) has achieved great success ...
