MS-UEdin Submission to the WMT2018 APE Shared Task: Dual-Source Transformer for Automatic Post-Editing

09/01/2018
by Marcin Junczys-Dowmunt, et al.

This paper describes the Microsoft and University of Edinburgh submission to the Automatic Post-Editing shared task at WMT2018. Based on training data and systems from the WMT2017 shared task, we re-implement our models from the last shared task and introduce improvements based on extensive parameter sharing. Next, we experiment with our implementation of dual-source transformer models and with data selection for the IT domain. Our submission decisively wins the SMT post-editing sub-task, establishing a new state of the art, and is a very close second (or equal, 16.46 vs. 16.50 TER) in the NMT sub-task. Based on the rather weak results in the NMT sub-task, we hypothesize that neural-on-neural APE might not actually be useful.
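The dual-source transformer mentioned in the abstract conditions the decoder on two inputs: the source sentence and the raw MT output to be post-edited. As a minimal sketch (not the authors' implementation), one way to realize this is a decoder layer with two stacked cross-attention sub-layers, one per encoder; the function names, the single-head attention, and the stacking order are illustrative assumptions:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # Scaled dot-product attention (single head, no projections).
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores) @ v

def dual_source_decoder_layer(dec, src_enc, mt_enc):
    """Hypothetical dual-source decoder layer: self-attention,
    then cross-attention over the source encoder states,
    then cross-attention over the MT encoder states,
    each with a residual connection."""
    x = dec + attention(dec, dec, dec)       # self-attention over decoder states
    x = x + attention(x, src_enc, src_enc)   # attend to the source sentence
    x = x + attention(x, mt_enc, mt_enc)     # attend to the raw MT output
    return x

# Toy shapes: 5 target positions, 7 source tokens, 6 MT tokens, width 8.
rng = np.random.default_rng(0)
out = dual_source_decoder_layer(
    rng.normal(size=(5, 8)),
    rng.normal(size=(7, 8)),
    rng.normal(size=(6, 8)),
)
```

A full model would add layer normalization, multi-head projections, and feed-forward sub-layers; the point here is only that the two encoders feed the decoder through separate cross-attention blocks, which is also where extensive parameter sharing between the encoders can be applied.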

