The Transference Architecture for Automatic Post-Editing

08/16/2019
by Santanu Pal, et al.

In automatic post-editing (APE) it makes sense to condition post-editing (pe) decisions on both the source (src) and the machine-translated text (mt) as input. This has led to multi-source encoder based APE approaches. A research challenge now is the search for architectures that best support the capture, preparation, and provision of src and mt information and its integration with pe decisions. In this paper we present a new multi-source APE model, called transference. Unlike previous approaches, it (i) uses a transformer encoder block for src, (ii) followed by a decoder block, but without masking for self-attention on mt, which effectively acts as a second encoder combining src -> mt, and (iii) feeds this representation into a final decoder block generating pe. Our model outperforms the state-of-the-art by 1 BLEU point on the WMT 2016, 2017, and 2018 English--German APE shared tasks (PBSMT and NMT). We further investigate the importance of our newly introduced second encoder and find that too few layers in it hurt performance, while reducing the number of layers in the final decoder matters little.
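The three-block pipeline described above can be sketched in PyTorch. This is a minimal, hypothetical rendering (the names, dimensions, and layer counts are illustrative assumptions, not the authors' implementation): a standard transformer encoder over src; a decoder-style block over mt whose self-attention is left unmasked so it acts as a second encoder cross-attending to the src encoding; and a final, causally masked decoder that attends to that joint representation while generating pe.

```python
import torch
import torch.nn as nn

class Transference(nn.Module):
    """Hypothetical sketch of the transference architecture (not the paper's code)."""

    def __init__(self, vocab_size=1000, d_model=64, nhead=4,
                 n_src_layers=2, n_mt_layers=2, n_pe_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # (i) transformer encoder block for src
        self.src_encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True),
            num_layers=n_src_layers)
        # (ii) decoder block over mt with UNMASKED self-attention,
        # cross-attending to the src encoding: the "second encoder"
        self.mt_encoder = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model, nhead, batch_first=True),
            num_layers=n_mt_layers)
        # (iii) final decoder generating pe, with causal self-attention
        self.pe_decoder = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model, nhead, batch_first=True),
            num_layers=n_pe_layers)
        self.out_proj = nn.Linear(d_model, vocab_size)

    def forward(self, src, mt, pe):
        src_h = self.src_encoder(self.embed(src))
        # no tgt_mask here: mt tokens may attend to all mt positions
        joint = self.mt_encoder(self.embed(mt), memory=src_h)
        # causal mask so each pe position only sees earlier positions
        causal = nn.Transformer.generate_square_subsequent_mask(pe.size(1))
        dec_h = self.pe_decoder(self.embed(pe), memory=joint, tgt_mask=causal)
        return self.out_proj(dec_h)  # logits over the pe vocabulary

model = Transference()
src = torch.randint(0, 1000, (2, 7))   # batch of 2 source sentences
mt = torch.randint(0, 1000, (2, 8))    # machine-translated text
pe = torch.randint(0, 1000, (2, 5))    # post-edit prefix (teacher forcing)
logits = model(src, mt, pe)            # shape: (2, 5, 1000)
```

Reusing a decoder layer as the second encoder is what lets mt jointly attend to itself and to src; the only change from a conventional decoder is dropping the causal mask.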


Related research

08/15/2019
Transformer-based Automatic Post-Editing with a Context-Aware Encoding Approach for Multi-Source Inputs
Recent approaches to the Automatic Post-Editing (APE) research have show...

08/09/2019
UdS Submission for the WMT 19 Automatic Post-Editing Task
In this paper, we describe our submission to the English-German APE shar...

06/13/2017
An Exploration of Neural Sequence-to-Sequence Architectures for Automatic Post-Editing
In this work, we explore multiple neural architectures adapted for the t...

10/14/2019
Estimating post-editing effort: a study on human judgements, task-based and reference-based metrics of MT quality
Devising metrics to assess translation quality has always been at the co...

05/30/2019
Unbabel's Submission to the WMT2019 APE Shared Task: BERT-based Encoder-Decoder for Automatic Post-Editing
This paper describes Unbabel's submission to the WMT2019 APE Shared Task...

09/01/2018
MS-UEdin Submission to the WMT2018 APE Shared Task: Dual-Source Transformer for Automatic Post-Editing
This paper describes the Microsoft and University of Edinburgh submissio...

09/15/2019
Automatically Extracting Challenge Sets for Non-local Phenomena Neural Machine Translation
We show that the state-of-the-art Transformer MT model is not biased tow...
