Cross-Lingual BERT Transformation for Zero-Shot Dependency Parsing

09/15/2019
by   Yuxuan Wang, et al.
This paper investigates the problem of learning cross-lingual representations in a contextual space. We propose Cross-Lingual BERT Transformation (CLBT), a simple and efficient approach to generate cross-lingual contextualized word embeddings based on publicly available pre-trained BERT models (Devlin et al., 2018). In this approach, a linear transformation is learned from contextual word alignments to align the contextualized embeddings independently trained in different languages. We demonstrate the effectiveness of this approach on zero-shot cross-lingual transfer parsing. Experiments show that our embeddings substantially outperform the previous state-of-the-art that uses static embeddings. We further compare our approach with XLM (Lample and Conneau, 2019), a recently proposed cross-lingual language model trained with massive parallel data, and achieve highly competitive results.
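The core idea of CLBT is to learn a linear map from pairs of word-aligned contextual embeddings, so that a source-language BERT vector lands close to its target-language counterpart. A common way to fit such a map is least squares, or orthogonal Procrustes via SVD when the transformation is constrained to be a rotation. The sketch below is illustrative only (it is not the authors' released code; the function name and the toy data are assumptions), showing both variants on synthetic aligned pairs:

```python
import numpy as np

def learn_linear_map(src, tgt, orthogonal=True):
    """Fit W so that W @ src_vec approximates tgt_vec.

    src, tgt: (n, d) arrays of aligned contextual word embeddings,
    one row per word-aligned pair extracted from parallel text.
    orthogonal=True solves the Procrustes problem via SVD;
    otherwise an unconstrained least-squares map is fit.
    """
    if orthogonal:
        # Procrustes: minimize ||W A - B||_F over orthogonal W,
        # where A = src.T and B = tgt.T; solution W = U V^T
        # from the SVD of B A^T = tgt.T @ src.
        u, _, vt = np.linalg.svd(tgt.T @ src)
        return u @ vt
    # Unconstrained: solve src @ X = tgt in the least-squares
    # sense, then W = X.T.
    x, *_ = np.linalg.lstsq(src, tgt, rcond=None)
    return x.T

# Toy check: recover a known orthogonal map from aligned pairs.
rng = np.random.default_rng(0)
d = 8
q, _ = np.linalg.qr(rng.normal(size=(d, d)))  # ground-truth rotation
src = rng.normal(size=(100, d))
tgt = src @ q.T                               # perfectly aligned pairs
w = learn_linear_map(src, tgt)
assert np.allclose(w, q, atol=1e-6)
```

In practice the aligned pairs would come from running the two monolingual BERT models over word-aligned parallel sentences rather than from synthetic data.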


Related Research

03/03/2021: Zero-Shot Cross-Lingual Dependency Parsing through Contextual Embedding Transformation
Linear embedding transformation has been shown to be effective for zero-...

04/19/2019: Beto, Bentz, Becas: The Surprising Cross-Lingual Effectiveness of BERT
Pretrained contextual representation models (Peters et al., 2018; Devlin...

02/25/2019: Cross-Lingual Alignment of Contextual Word Embeddings, with Applications to Zero-shot Dependency Parsing
We introduce a novel method for multilingual transfer that utilizes deep...

12/09/2020: Cross-lingual Word Sense Disambiguation using mBERT Embeddings with Syntactic Dependencies
Cross-lingual word sense disambiguation (WSD) tackles the challenge of d...

09/10/2021: Examining Cross-lingual Contextual Embeddings with Orthogonal Structural Probes
State-of-the-art contextual embeddings are obtained from large language ...

09/10/2021: Genre as Weak Supervision for Cross-lingual Dependency Parsing
Recent work has shown that monolingual masked language models learn to r...

04/27/2022: LyS_ACoruña at SemEval-2022 Task 10: Repurposing Off-the-Shelf Tools for Sentiment Analysis as Semantic Dependency Parsing
This paper addressed the problem of structured sentiment analysis using ...
