Cross-lingual alignments of ELMo contextual embeddings

06/30/2021
by Matej Ulčar, et al.

Building machine learning prediction models for a specific NLP task requires sufficient training data, which can be difficult to obtain for low-resource languages. Cross-lingual embeddings map word embeddings from a low-resource language to a high-resource language so that a prediction model trained on data from the high-resource language can also be used in the low-resource language. To produce cross-lingual mappings of recent contextual embeddings, the anchor points between the embedding spaces have to be words in the same context. We address this issue with a new method for creating datasets of cross-lingual contextual alignments. Based on these datasets, we propose novel cross-lingual mapping methods for ELMo embeddings. Our linear mapping methods apply the existing vecmap and MUSE alignments to contextual ELMo embeddings. Our new nonlinear ELMoGAN mapping method is based on GANs and does not assume isomorphic embedding spaces. We evaluate the proposed mapping methods on nine languages, using two downstream tasks: NER and dependency parsing. The ELMoGAN method performs well on NER, losing little cross-lingually compared to direct training for some languages. For dependency parsing, the linear mapping variants are more successful.
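
To make the linear variant concrete, the sketch below (not the paper's released code) assumes we already have anchor pairs of contextual ELMo vectors, i.e., vectors of the same word occurring in corresponding contexts in the two languages, and fits the orthogonal Procrustes map used in vecmap- and MUSE-style refinement; the `fit_orthogonal_map` helper and the random anchor matrices are hypothetical stand-ins for real anchor data. The nonlinear ELMoGAN method instead trains the mapping adversarially and is not shown here.

```python
# Minimal sketch of linear cross-lingual alignment of contextual embeddings,
# assuming anchor pairs (same word, corresponding contexts) are already extracted.

import numpy as np

def fit_orthogonal_map(src_anchors: np.ndarray, tgt_anchors: np.ndarray) -> np.ndarray:
    """src_anchors, tgt_anchors: (n_anchors, dim) contextual vectors of aligned
    word-in-context pairs. Returns an orthogonal W such that src @ W ≈ tgt
    (the Procrustes solution, minimizing the Frobenius distance)."""
    # SVD of the cross-covariance matrix gives the optimal rotation.
    u, _, vt = np.linalg.svd(src_anchors.T @ tgt_anchors)
    return u @ vt

# Hypothetical anchor data: 5000 word-in-context pairs, 1024-dim ELMo vectors.
rng = np.random.default_rng(0)
src = rng.normal(size=(5000, 1024))
tgt = rng.normal(size=(5000, 1024))

W = fit_orthogonal_map(src, tgt)
mapped = src @ W  # source-language contextual vectors projected into the target space
```

Because W is constrained to be orthogonal, this linear mapping implicitly assumes the two embedding spaces are (approximately) isomorphic, which is exactly the assumption the GAN-based ELMoGAN mapping drops.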
