Cross-Lingual Alignment of Contextual Word Embeddings, with Applications to Zero-shot Dependency Parsing

02/25/2019
by Tal Schuster, et al.

We introduce a novel method for multilingual transfer that utilizes deep contextual embeddings, pretrained in an unsupervised fashion. While contextual embeddings have been shown to yield richer representations of meaning compared to their static counterparts, aligning them poses a challenge due to their dynamic nature. To this end, we construct context-independent variants of the original monolingual spaces and use the mapping between them to derive an alignment for the context-dependent spaces. This mapping readily supports processing of a target language, improving transfer through context-aware embeddings. Our experimental results demonstrate the effectiveness of this approach for zero-shot and few-shot learning of dependency parsing. Specifically, our method consistently outperforms the previous state of the art on 6 target languages, yielding an improvement of 6.8 LAS points on average.
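The alignment step the abstract describes lends itself to a short illustration. Below is a minimal sketch, assuming the anchor-based recipe implied above: average each word type's contextual vectors into a context-independent "anchor", learn an orthogonal (Procrustes) map between the two anchor spaces from dictionary-paired word types, and reuse that map on the contextual vectors themselves. The function names and the random stand-in data are illustrative, not taken from the paper's released code.

```python
import numpy as np

def build_anchors(contextual_embs, token_ids, vocab_size, dim):
    """Average the contextual embeddings of all occurrences of each word
    type, yielding one context-independent "anchor" vector per type."""
    anchors = np.zeros((vocab_size, dim))
    counts = np.zeros(vocab_size)
    for emb, tok in zip(contextual_embs, token_ids):
        anchors[tok] += emb
        counts[tok] += 1
    counts[counts == 0] = 1.0  # avoid division by zero for unseen types
    return anchors / counts[:, None]

def procrustes_map(src_anchors, tgt_anchors):
    """Solve min_W ||XW - Y||_F over orthogonal W (the Procrustes
    solution): with SVD X^T Y = U S V^T, the minimizer is W = U V^T."""
    u, _, vt = np.linalg.svd(src_anchors.T @ tgt_anchors)
    return u @ vt

# --- toy usage with random stand-ins for real (e.g. ELMo) vectors ---
rng = np.random.default_rng(0)
dim, vocab = 64, 100
src_ctx = rng.normal(size=(500, dim))       # contextual vectors, source lang
src_tok = rng.integers(0, vocab, size=500)  # word-type id of each vector
tgt_ctx = rng.normal(size=(500, dim))
tgt_tok = rng.integers(0, vocab, size=500)

X = build_anchors(src_ctx, src_tok, vocab, dim)
Y = build_anchors(tgt_ctx, tgt_tok, vocab, dim)
# Here rows are paired by shared type ids; in practice the pairing would
# come from a bilingual dictionary.
W = procrustes_map(X, Y)

# The map learned on static anchors is then applied to the *contextual*
# source embeddings, placing them in the target space:
aligned_src_ctx = src_ctx @ W
```

Constraining W to be orthogonal is what makes this transfer plausible: an orthogonal map preserves distances and angles within the source space, so the geometry of the contextual vectors around each anchor is carried over intact when the anchor-level mapping is reused on them.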
