On the cross-lingual transferability of multilingual prototypical models across NLU tasks

07/19/2022
by   Oralie Cattan, et al.

Supervised deep learning approaches have been applied to task-oriented dialog and have proven effective for limited-domain and limited-language applications when a sufficient number of training examples is available. In practice, these approaches suffer from the drawbacks of domain-driven design and of under-resourced languages: domain and language models must grow and change as the problem space evolves. On the one hand, research on transfer learning has demonstrated the cross-lingual ability of multilingual Transformer-based models to learn semantically rich representations. On the other hand, meta-learning has enabled the development of task and language learning algorithms capable of broad generalization. In this context, this article investigates the cross-lingual transferability obtained by combining few-shot learning with prototypical neural networks and multilingual Transformer-based models. Experiments on natural language understanding tasks on the MultiATIS++ corpus show that our approach substantially improves the observed transfer learning performance between low- and high-resource languages. More generally, our approach confirms that the meaningful latent space learned in a given language can be generalized to unseen and under-resourced languages using meta-learning.
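The prototypical-network idea named above can be sketched in a few lines: each class prototype is the mean of the support-set embeddings for that class, and a query is assigned to the nearest prototype. The minimal pure-Python illustration below is an assumption-laden sketch, not the authors' implementation; in the paper's setting the embedding vectors would come from a multilingual Transformer encoder rather than be given directly.

```python
import math

def build_prototypes(support_emb, support_labels):
    """Average the support embeddings of each class into one prototype.

    support_emb: list of embedding vectors (lists of floats).
    support_labels: parallel list of class labels.
    Returns (sorted class labels, prototype vector per class).
    """
    classes = sorted(set(support_labels))
    protos = []
    for c in classes:
        members = [e for e, l in zip(support_emb, support_labels) if l == c]
        # Component-wise mean of the class's support embeddings.
        protos.append([sum(dim) / len(members) for dim in zip(*members)])
    return classes, protos

def classify(query_emb, protos):
    """Return the index of the nearest prototype (Euclidean distance)."""
    dists = [math.dist(query_emb, p) for p in protos]
    return dists.index(min(dists))
```

A toy usage: with two support examples per class, a query close to the first class's examples is assigned its label.

```python
support = [[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 4.9]]
labels = ["book_flight", "book_flight", "ground_service", "ground_service"]
classes, protos = build_prototypes(support, labels)
classes[classify([0.05, 0.1], protos)]  # → "book_flight"
```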


