Word Alignment by Fine-tuning Embeddings on Parallel Corpora

01/20/2021
by Zi-Yi Dou, et al.

Word alignment over parallel corpora has a wide variety of applications, including learning translation lexicons, cross-lingual transfer of language processing tools, and automatic evaluation or analysis of translation outputs. Most past work on word alignment has relied on unsupervised learning over parallel texts. Recently, however, other work has demonstrated that pre-trained contextualized word embeddings derived from multilingually trained language models (LMs) offer an attractive alternative, achieving competitive results on the word alignment task even in the absence of explicit training on parallel data. In this paper, we examine methods that marry the two approaches: we leverage pre-trained LMs, fine-tune them on parallel text with objectives designed to improve alignment quality, and propose methods to effectively extract alignments from these fine-tuned models. We perform experiments on five language pairs and demonstrate that our model consistently outperforms previous state-of-the-art models of all varieties. In addition, we demonstrate that we are able to train multilingual word aligners that obtain robust performance across different language pairs. Our aligner, AWESOME (Aligning Word Embedding Spaces of Multilingual Encoders), with pre-trained models, is available at https://github.com/neulab/awesome-align.
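As a rough illustration of how alignments can be read off contextualized embeddings, the sketch below computes a source-target similarity matrix, applies a softmax in each direction, and keeps word pairs whose bidirectional probability product clears a threshold — one common extraction strategy in this line of work. The toy embeddings, the raw dot-product similarity, and the `threshold` value are simplifying assumptions for illustration; the actual aligner operates on subword representations produced by a fine-tuned multilingual LM.

```python
import numpy as np

def extract_alignments(src_emb, tgt_emb, threshold=0.001):
    """Extract word alignments from source/target embedding matrices.

    src_emb: (n_src, d) array of source word embeddings.
    tgt_emb: (n_tgt, d) array of target word embeddings.
    Returns a list of (src_index, tgt_index) alignment pairs.
    """
    # Pairwise similarity between every source and target word.
    sim = src_emb @ tgt_emb.T

    def softmax(x, axis):
        # Numerically stable softmax along the given axis.
        e = np.exp(x - x.max(axis=axis, keepdims=True))
        return e / e.sum(axis=axis, keepdims=True)

    # Normalize in both directions: source-to-target and target-to-source.
    p_s2t = softmax(sim, axis=1)
    p_t2s = softmax(sim, axis=0)

    # Keep pairs whose bidirectional probability product exceeds the threshold.
    keep = (p_s2t * p_t2s) > threshold
    return [(int(i), int(j)) for i, j in zip(*np.nonzero(keep))]
```

With well-separated embeddings (e.g. scaled one-hot vectors for a toy three-word sentence pair), only the mutually most-similar pairs survive the bidirectional product, mimicking the intersection of two directional alignments.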

