To What Degree Can Language Borders Be Blurred In BERT-based Multilingual Spoken Language Understanding?

11/10/2020
by Quynh Do, et al.

This paper addresses the question of to what degree a BERT-based multilingual Spoken Language Understanding (SLU) model can transfer knowledge across languages. Through experiments, we show that although such transfer works surprisingly well even across distant language groups, a gap to the ideal multilingual performance remains. In addition, we propose a novel BERT-based adversarial model architecture that learns language-shared and language-specific representations for multilingual SLU. Our experimental results show that the proposed model narrows the gap to the ideal multilingual performance.
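The abstract does not give implementation details, but the general pattern of learning language-shared and language-specific representations with an adversarial objective can be sketched as follows. This is a minimal, forward-only numpy illustration under assumed dimensions; all names (`W_shared`, `W_private`, `W_lang`, `W_intent`, the weight `lam`) are hypothetical, not the paper's architecture. A shared encoder feeds a language discriminator whose loss the encoder would be trained to maximize (typically via a gradient reversal layer), while the task head consumes both shared and private features.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, W):
    """Toy encoder: one linear layer with tanh (stands in for BERT features)."""
    return np.tanh(x @ W)

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(probs, labels):
    return -np.mean(np.log(probs[np.arange(len(labels)), labels] + 1e-9))

# Hypothetical setup: 4 utterances, 16-dim input features,
# 8-dim shared/private spaces, 3 languages, 5 intent classes.
x = rng.normal(size=(4, 16))
lang_labels = np.array([0, 1, 2, 0])
intent_labels = np.array([1, 4, 0, 2])

W_shared = rng.normal(size=(16, 8))   # language-shared encoder
W_private = rng.normal(size=(16, 8))  # language-specific encoder
W_lang = rng.normal(size=(8, 3))      # language discriminator head
W_intent = rng.normal(size=(16, 5))   # SLU (intent) head on combined features

h_shared = encode(x, W_shared)
h_private = encode(x, W_private)
h = np.concatenate([h_shared, h_private], axis=-1)

# Task loss: intent classification on the combined representation.
task_loss = cross_entropy(softmax(h @ W_intent), intent_labels)

# Adversarial loss: the discriminator predicts the language from the
# *shared* representation only; a gradient reversal layer would flip
# this gradient for the shared encoder so it becomes language-agnostic.
adv_loss = cross_entropy(softmax(h_shared @ W_lang), lang_labels)

lam = 0.1  # adversarial weight (hypothetical)
total_loss = task_loss + lam * adv_loss
```

In training, minimizing `adv_loss` updates the discriminator while the reversed gradient pushes `h_shared` toward language-invariance, which is the mechanism that blurs language borders in the shared space.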


