Continuous Space Reordering Models for Phrase-based MT

01/25/2018
by Nadir Durrani, et al.

Bilingual sequence models improve phrase-based translation and reordering by overcoming the phrasal independence assumption and handling long-range reordering. However, due to data sparsity, these models often fall back to very small context sizes. This problem has previously been addressed by learning sequences over generalized representations such as POS tags or word clusters. In this paper, we explore an alternative based on neural network models. More concretely, we train neuralized versions of the lexicalized reordering and operation sequence models using feed-forward neural networks. Our results show improvements of up to 0.6 and 0.5 BLEU points over the baseline German->English and English->German systems, respectively. We also observe improvements over systems that used POS tags and word clusters to train these models. Because we modify the bilingual corpus to integrate reordering operations, we can also train a sequence-to-sequence neural MT model with explicit reordering triggers. Our motivation was to directly provide reordering information to the encoder-decoder framework, which otherwise relies solely on the attention mechanism to handle long-range reordering. We tried both coarse- and fine-grained reordering operations; however, these experiments did not yield any improvements over the baseline neural MT systems.
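To make the neuralized reordering component more concrete, the sketch below shows a small feed-forward classifier that predicts a reordering operation (e.g., monotone, swap, or discontinuous) from a window of embedded bilingual context words. This is a minimal illustrative sketch in PyTorch under assumed settings; the vocabulary size, context window, operation inventory, and all identifiers are assumptions for illustration, not the authors' implementation.

# Minimal sketch (assumption, not the paper's code): a feed-forward network
# mapping a window of bilingual context word IDs to a reordering operation.
import torch
import torch.nn as nn

class ReorderingFFNN(nn.Module):
    def __init__(self, vocab_size=50000, emb_dim=100, context_size=8,
                 hidden_dim=256, num_operations=3):
        super().__init__()
        # Shared embedding table for source and target context tokens (assumption).
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.hidden = nn.Linear(context_size * emb_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, num_operations)  # e.g. monotone / swap / discontinuous

    def forward(self, context_ids):
        # context_ids: (batch, context_size) word IDs drawn from source and target sides.
        emb = self.embed(context_ids)                   # (batch, context_size, emb_dim)
        flat = emb.view(emb.size(0), -1)                # concatenate the context window
        return self.out(torch.tanh(self.hidden(flat)))  # logits over reordering operations

# Toy usage: a batch of 2 contexts, each with 8 word IDs, trained with cross-entropy.
model = ReorderingFFNN()
logits = model(torch.randint(0, 50000, (2, 8)))
loss = nn.functional.cross_entropy(logits, torch.tensor([0, 2]))

The same kind of context/operation pairing underlies both the lexicalized reordering and operation sequence variants described in the abstract; only the operation inventory and context definition change.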
