Call Larisa Ivanovna: Code-Switching Fools Multilingual NLU Models

09/29/2021
by Alexey Birshert, et al.

Practical needs of developing task-oriented dialogue assistants require the ability to understand many languages. Novel benchmarks for multilingual natural language understanding (NLU) include monolingual sentences in several languages, annotated with intents and slots. In such a setup, cross-lingual transfer models show remarkable performance in joint intent recognition and slot filling. However, existing benchmarks lack code-switched utterances, which are difficult to gather and label due to the complexity of their grammatical structure. The evaluation of NLU models thus appears biased and limited, since code-switching is left out of scope. Our work adopts recognized methods to generate plausible and naturally-sounding code-switched utterances and uses them to create a synthetic code-switched test set. Based on experiments, we report that state-of-the-art NLU models are unable to handle code-switching: at worst, performance measured by semantic accuracy drops from 80% to as low as 15% across languages. We further show that pre-training on synthetic code-mixed data helps maintain performance on the proposed test set at a level comparable to monolingual data. Finally, we analyze different language pairs and show that the closer the languages are, the better the NLU model handles their alternation. This is in line with the common understanding of how multilingual models transfer knowledge between languages.
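To make the two key ideas concrete, here is a minimal, hypothetical Python sketch: synthesizing a code-switched utterance by substituting a random subset of tokens with translations from a toy lexicon, and scoring models with exact-match semantic accuracy over intent and slot predictions. The names (TOY_LEXICON, code_switch, semantic_accuracy) and the word-level substitution strategy are illustrative assumptions, not the authors' actual pipeline, which may rely on a full bilingual dictionary or machine translation.

    import random

    # Toy English-to-Russian lexicon; an assumption for illustration only.
    TOY_LEXICON = {
        "call": "позвони",
        "play": "включи",
        "music": "музыку",
        "tomorrow": "завтра",
    }

    def code_switch(tokens, switch_prob=0.3, seed=0):
        """Replace a random subset of tokens with translations to
        synthesize a code-switched utterance (word-level substitution)."""
        rng = random.Random(seed)
        return [
            TOY_LEXICON[tok.lower()]
            if tok.lower() in TOY_LEXICON and rng.random() < switch_prob
            else tok
            for tok in tokens
        ]

    def semantic_accuracy(predictions, references):
        """Fraction of utterances where both the intent and all slot
        labels exactly match the reference (the metric cited above)."""
        correct = sum(
            p["intent"] == r["intent"] and p["slots"] == r["slots"]
            for p, r in zip(predictions, references)
        )
        return correct / len(references)

    # Example: mix Russian words into an English command.
    print(code_switch("call Larisa Ivanovna tomorrow".split(), switch_prob=1.0))
    # -> ['позвони', 'Larisa', 'Ivanovna', 'завтра']

Under this exact-match definition, a single wrong slot label makes the whole utterance count as incorrect, which is why semantic accuracy degrades sharply when the model mishandles the switched-language tokens.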


Related research

03/13/2021
Multilingual Code-Switching for Zero-Shot Cross-Lingual Intent Prediction and Slot Filling
Predicting user intent and detecting the corresponding slots from text a...

05/07/2022
Multi-level Contrastive Learning for Cross-lingual Spoken Language Understanding
Although spoken language understanding (SLU) has achieved great success ...

03/24/2021
Are Multilingual Models Effective in Code-Switching?
Multilingual language models have shown decent performance in multilingu...

03/17/2021
Code-Mixing on Sesame Street: Dawn of the Adversarial Polyglots
Multilingual models have demonstrated impressive cross-lingual transfer ...

07/30/2015
One model, two languages: training bilingual parsers with harmonized treebanks
We introduce an approach to train lexicalized parsers using bilingual co...

06/21/2019
A Deep Generative Model for Code-Switched Text
Code-switching, the interleaving of two or more languages within a sente...
