Code Switching Language Model Using Monolingual Training Data

12/23/2020
by Asad Ullah, et al.

Training a code-switching (CS) language model using only monolingual data remains an open research problem. In this paper, a CS language model is trained using only monolingual training data. Because recurrent neural network (RNN) models are well suited to predicting sequential data, an RNN language model is trained on alternating batches drawn only from monolingual English and Spanish data, and the perplexity of the language model is computed. The results show that training on alternating batches of monolingual data reduces the perplexity of the CS language model. Adding a mean squared error (MSE) term on the output embeddings of the RNN-based language model yielded consistent further improvements. Combining both methods reduces perplexity from 299.63 to 80.38, making the proposed methods comparable to a language model fine-tuned on code-switching training data.
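The core training scheme described above can be sketched as a batch scheduler that interleaves the two monolingual corpora, so the model alternates between English and Spanish without ever seeing code-switched text. This is a minimal illustration only; the corpora, batch size, and function names below are hypothetical placeholders, not the authors' implementation.

```python
def batches(corpus, batch_size):
    """Yield successive fixed-size batches of token sequences."""
    for i in range(0, len(corpus), batch_size):
        yield corpus[i:i + batch_size]

def alternate_batches(english, spanish, batch_size):
    """Interleave one English batch, then one Spanish batch, and so on.
    Stops when the shorter corpus is exhausted."""
    for en_batch, es_batch in zip(batches(english, batch_size),
                                  batches(spanish, batch_size)):
        yield ("en", en_batch)
        yield ("es", es_batch)

# Toy corpora standing in for real monolingual training data.
english = [["the", "cat"], ["a", "dog"], ["the", "sun"], ["a", "tree"]]
spanish = [["el", "gato"], ["un", "perro"], ["el", "sol"], ["un", "árbol"]]

schedule = list(alternate_batches(english, spanish, batch_size=2))
print([lang for lang, _ in schedule])
```

In an actual training loop, each yielded batch would feed one gradient step of the RNN language model, with the MSE term on the output embeddings added to the usual cross-entropy loss.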

