Improving cross-lingual model transfer by chunking

02/27/2020
by Ayan Das, et al.

We present a shallow-parser-guided approach to cross-lingual model transfer that addresses the syntactic differences between source and target languages more effectively. In this work, we treat the chunks (phrases) of a sentence as the transfer units, which lets us handle two sources of cross-lingual syntactic divergence separately: differences in the ordering of words within phrases, and differences in the ordering of phrases within a sentence.
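To make the two-level idea concrete, here is a minimal sketch (not the authors' implementation; the chunk boundaries, rules, and function names are all invented for illustration) of how word order can be adjusted in two independent steps, first inside each chunk and then across chunks:

```python
# Hypothetical illustration of chunk-based transfer: word order is fixed
# in two independent steps -- inside each chunk, then across chunks.
# Chunk boundaries and reordering rules here are toy assumptions.

def reorder_within_chunks(chunks, rule):
    """Apply an intra-chunk word-order rule to every chunk."""
    return [rule(chunk) for chunk in chunks]

def reorder_chunks(chunks, rule):
    """Apply an inter-chunk ordering rule to the chunk sequence."""
    return rule(chunks)

# Toy example: an SVO source sentence chunked as [NP, VP, NP].
chunks = [["the", "cat"], ["ate"], ["the", "fish"]]

# Intra-chunk rule: identity (word order inside each phrase is kept).
intra = lambda chunk: chunk
# Inter-chunk rule: move the verb phrase to the end (SVO -> SOV),
# mimicking transfer to a head-final target language.
inter = lambda cs: [cs[0], cs[2], cs[1]]

target_chunks = reorder_chunks(reorder_within_chunks(chunks, intra), inter)
sentence = " ".join(w for chunk in target_chunks for w in chunk)
print(sentence)  # -> "the cat the fish ate"
```

The point of the decomposition is that the two rules can be learned or specified independently, so a language pair that differs only in phrase order needs no intra-chunk rule at all.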


Related research

09/05/2019: Cross-Lingual Dependency Parsing Using Code-Mixed TreeBank
Treebank translation is a promising method for cross-lingual transfer of...

05/27/2019: Specific polysemy of the brief sapiential units
In this paper we explain how we deal with the problems related to the co...

10/09/2020: Investigating Cross-Linguistic Adjective Ordering Tendencies with a Latent-Variable Model
Across languages, multiple consecutive adjectives modifying a noun (e.g....

04/16/2020: Towards Instance-Level Parser Selection for Cross-Lingual Transfer of Dependency Parsers
Current methods of cross-lingual parser transfer focus on predicting the...

03/30/2023: Fine-Tuning BERT with Character-Level Noise for Zero-Shot Transfer to Dialects and Closely-Related Languages
In this work, we induce character-level noise in various forms when fine...

10/19/2016: Cross-Lingual Syntactic Transfer with Limited Resources
We describe a simple but effective method for cross-lingual syntactic tr...

05/20/2023: Constructing Code-mixed Universal Dependency Forest for Unbiased Cross-lingual Relation Extraction
Latest efforts on cross-lingual relation extraction (XRE) aggressively l...
