SYSTRAN Purely Neural MT Engines for WMT2017

09/12/2017
by Yongchao Deng, et al.

This paper describes SYSTRAN's systems submitted to the WMT 2017 shared news translation task for English-German, in both translation directions. Our systems are built using OpenNMT, an open-source neural machine translation system implementing sequence-to-sequence models with LSTM encoders/decoders and attention. We experimented with using automatically back-translated monolingual data. Our resulting models are further hyper-specialised with an adaptation technique that fine-tunes the models according to the evaluation test sentences.
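To make the described architecture concrete, below is a minimal sketch (not the authors' code) of a sequence-to-sequence model with an LSTM encoder/decoder and global dot-product attention, the general setup the abstract attributes to the OpenNMT-based systems. Vocabulary sizes, hidden sizes, and the specific attention variant are illustrative assumptions; the back-translation and test-set adaptation steps are not shown.

```python
# Sketch only: LSTM encoder/decoder with global (dot-product) attention.
# All dimensions and the attention variant are assumptions for illustration.
import torch
import torch.nn as nn

class Seq2SeqAttention(nn.Module):
    def __init__(self, src_vocab=100, tgt_vocab=100, emb=64, hidden=128):
        super().__init__()
        self.src_emb = nn.Embedding(src_vocab, emb)
        self.tgt_emb = nn.Embedding(tgt_vocab, emb)
        self.encoder = nn.LSTM(emb, hidden, batch_first=True)
        self.decoder = nn.LSTM(emb, hidden, batch_first=True)
        self.attn_out = nn.Linear(2 * hidden, hidden)   # combine context + decoder state
        self.generator = nn.Linear(hidden, tgt_vocab)   # project to target vocabulary

    def forward(self, src, tgt):
        # Encode the source sentence.
        enc_out, state = self.encoder(self.src_emb(src))       # (B, S, H)
        # Decode the shifted target, initialised from the encoder's final state.
        dec_out, _ = self.decoder(self.tgt_emb(tgt), state)    # (B, T, H)
        # Global dot-product attention over encoder outputs.
        scores = torch.bmm(dec_out, enc_out.transpose(1, 2))   # (B, T, S)
        weights = torch.softmax(scores, dim=-1)
        context = torch.bmm(weights, enc_out)                  # (B, T, H)
        attended = torch.tanh(self.attn_out(torch.cat([context, dec_out], dim=-1)))
        return self.generator(attended)                        # (B, T, tgt_vocab)

if __name__ == "__main__":
    model = Seq2SeqAttention()
    src = torch.randint(0, 100, (2, 7))   # toy batch: 2 source sentences, length 7
    tgt = torch.randint(0, 100, (2, 5))   # toy batch: 2 target prefixes, length 5
    print(model(src, tgt).shape)          # torch.Size([2, 5, 100])
```

In practice the paper's systems were trained with the OpenNMT toolkit rather than hand-written model code; the sketch is only meant to show how an attentional LSTM encoder/decoder fits together.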


