One model, two languages: training bilingual parsers with harmonized treebanks

07/30/2015
by David Vilares et al.

We introduce an approach to train lexicalized parsers using bilingual corpora obtained by merging harmonized treebanks of different languages, producing parsers that can analyze sentences in either of the learned languages, or even sentences that mix both. We test the approach on the Universal Dependency Treebanks, training with MaltParser and MaltOptimizer. The results show that these bilingual parsers are more than competitive, as most combinations not only preserve accuracy, but some even achieve significant improvements over the corresponding monolingual parsers. Preliminary experiments also show the approach to be promising on texts with code-switching and when more languages are added.
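The core of the approach is straightforward to prototype: because the treebanks are harmonized (shared POS tags and dependency labels across languages), two monolingual treebanks can be concatenated sentence by sentence into a single bilingual training file before training one parser model on the result. A minimal sketch, assuming CoNLL-style treebanks with blank-line-separated sentence blocks (the function names are illustrative, not from the paper):

```python
# Hypothetical sketch, not the authors' code: building a bilingual training
# corpus by merging two harmonized treebanks in CoNLL format.

def read_conll_sentences(text):
    """Split a CoNLL-formatted treebank into its sentence blocks
    (sentences are separated by blank lines)."""
    return [block.strip() for block in text.strip().split("\n\n") if block.strip()]

def merge_treebanks(*treebank_texts):
    """Concatenate treebanks into one bilingual corpus. The annotations
    must already be harmonized: the same POS tag set and dependency
    labels in every input, as in the Universal Dependency Treebanks."""
    sentences = []
    for text in treebank_texts:
        sentences.extend(read_conll_sentences(text))
    return "\n\n".join(sentences) + "\n\n"
```

The merged file can then be passed to a single training run (e.g. with MaltParser), yielding one model that has seen sentences from both languages.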

Related research

11/06/2018  Code-switching Sentence Generation by Generative Adversarial Networks and its Application to Data Augmentation
    Code-switching is about dealing with alternative languages in speech or ...

04/26/2022  Developing Universal Dependency Treebanks for Magahi and Braj
    In this paper, we discuss the development of treebanks for two low-resou...

08/06/2020  Phonological Features for 0-shot Multilingual Speech Synthesis
    Code-switching—the intra-utterance use of multiple languages—is prevalen...

11/14/2019  Training a code-switching language model with monolingual data
    A lack of code-switching data complicates the training of code-switching...

09/29/2021  Call Larisa Ivanovna: Code-Switching Fools Multilingual NLU Models
    Practical needs of developing task-oriented dialogue assistants require ...

05/02/2020  Treebank Embedding Vectors for Out-of-domain Dependency Parsing
    A recent advance in monolingual dependency parsing is the idea of a tree...

09/22/2022  Spatial model personalization in Gboard
    We introduce a framework for adapting a virtual keyboard to individual u...
