The University of Cambridge's Machine Translation Systems for WMT18

08/28/2018
by Felix Stahlberg, et al.

The University of Cambridge submission to the WMT18 news translation task focuses on combining diverse models of translation. We compare recurrent, convolutional, and self-attention-based neural models on German-English, English-German, and Chinese-English. Our final system combines all neural models together with a phrase-based SMT system in a minimum Bayes risk (MBR) based scheme. We report small but consistent gains on top of strong Transformer ensembles.
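The abstract does not spell out the MBR scheme itself, but the general idea behind MBR combination is to pool hypotheses from all systems and select the one with the highest expected utility (for example, sentence-level BLEU) under the combined model distribution, rather than the single highest-scoring hypothesis of any one model. The following is a minimal Python sketch of that generic recipe, not the paper's exact method; the function names, the smoothed BLEU utility, and the toy posteriors are all illustrative assumptions.

```python
from collections import Counter
import math

def ngrams(tokens, n):
    """All n-grams of a token list, as a multiset."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def sentence_bleu(hyp, ref, max_n=4):
    """Smoothed sentence-level BLEU, used here as the MBR utility.
    Illustrative stand-in; the paper does not specify its utility."""
    hyp, ref = hyp.split(), ref.split()
    if not hyp or not ref:
        return 0.0
    log_prec = 0.0
    for n in range(1, max_n + 1):
        h, r = ngrams(hyp, n), ngrams(ref, n)
        overlap = sum((h & r).values())
        total = max(sum(h.values()), 1)
        # Add-one smoothing keeps the log defined when an n-gram order is empty.
        log_prec += math.log((overlap + 1.0) / (total + 1.0))
    brevity = min(1.0, math.exp(1.0 - len(ref) / len(hyp)))
    return brevity * math.exp(log_prec / max_n)

def mbr_select(hypotheses):
    """Pick the pooled hypothesis with maximum expected utility.
    `hypotheses` is a list of (sentence, posterior) pairs gathered from
    all systems, with posteriors assumed normalized over the pool."""
    def expected_utility(cand):
        return sum(p * sentence_bleu(cand, other) for other, p in hypotheses)
    return max((h for h, _ in hypotheses), key=expected_utility)

# Toy usage: hypotheses from several systems with normalized posteriors.
pool = [("the cat sat on the mat", 0.5),
        ("the cat sits on the mat", 0.3),
        ("a cat sat on a mat", 0.2)]
print(mbr_select(pool))
```

In a real combination, the posteriors would come from the individual NMT and SMT models, suitably weighted and renormalized; the gains over a plain ensemble can then come from consensus, since a hypothesis supported by many diverse systems is preferred over one favored by a single model.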


Related research

06/10/2019
The University of Helsinki submissions to the WMT19 news translation task
In this paper, we present the University of Helsinki submissions to the ...

05/16/2016
The AMU-UEDIN Submission to the WMT16 News Translation Task: Attention-based NMT Models as Feature Functions in Phrase-based SMT
This paper describes the AMU-UEDIN submissions to the WMT 2016 shared ta...

01/05/2016
Multi-Source Neural Translation
We build a multi-source machine translation model and train it to maximi...

08/02/2017
The University of Edinburgh's Neural MT Systems for WMT17
This paper describes the University of Edinburgh's submissions to the WM...

10/12/2021
LightSeq2: Accelerated Training for Transformer-based Models on GPUs
Transformer-based neural models are used in many AI applications. Traini...

08/05/2017
A Comparison of Neural Models for Word Ordering
We compare several language models for the word-ordering task and propos...

07/14/2020
Modeling Voting for System Combination in Machine Translation
System combination is an important technique for combining the hypothese...
