
From Masked Language Modeling to Translation: Non-English Auxiliary Tasks Improve Zero-shot Spoken Language Understanding

by Rob van der Goot, et al.

The lack of publicly available evaluation data for low-resource languages limits progress in Spoken Language Understanding (SLU). Because key tasks like intent classification and slot filling require abundant training data, it is desirable to reuse existing data in high-resource languages to develop models for low-resource scenarios. We introduce xSID, a new benchmark for cross-lingual Slot and Intent Detection in 13 languages from 6 language families, including a very low-resource dialect. To tackle this challenge, we propose a joint learning approach that combines English SLU training data with non-English auxiliary tasks from raw text, syntax, and translation for transfer. We study two setups that differ in the type and language coverage of the pre-trained embeddings. Our results show that jointly learning the main tasks with masked language modeling is effective for slots, while machine translation transfer works best for intent classification.
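The joint learning approach described above can be sketched as a weighted multi-task objective: the main SLU losses (intent classification and slot filling on English data) are combined with an auxiliary loss computed on non-English data (e.g. masked language modeling or translation). The sketch below is illustrative only; the function names, the mean-over-tokens slot loss, and the single scalar `aux_weight` are assumptions, not the paper's exact formulation.

```python
import numpy as np

def cross_entropy(logits, label):
    # Softmax cross-entropy for a single example (numerically stable).
    z = logits - logits.max()
    log_probs = z - np.log(np.exp(z).sum())
    return -log_probs[label]

def joint_loss(intent_logits, intent_label, slot_logits, slot_labels,
               aux_loss, aux_weight=1.0):
    # Joint objective: main SLU tasks (sentence-level intent + token-level
    # slots) plus a weighted non-English auxiliary loss, e.g. a masked-LM
    # or machine-translation loss computed on raw target-language text.
    intent = cross_entropy(intent_logits, intent_label)
    slots = np.mean([cross_entropy(l, y)
                     for l, y in zip(slot_logits, slot_labels)])
    return intent + slots + aux_weight * aux_loss
```

In practice all task heads would share an encoder (e.g. multilingual BERT), and `aux_weight` controls how strongly the non-English auxiliary signal shapes the shared representation during transfer.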



