CUNI System for the WMT19 Robustness Task

06/21/2019
by Jindřich Helcl et al.

We present our submission to the WMT19 Robustness Task. Our baseline system is the Charles University (CUNI) Transformer system trained for the WMT18 shared task on News Translation. Quantitative results show that the CUNI Transformer system is already far more robust to noisy input than the LSTM-based baseline provided by the task organizers. We further improved the model's performance on noisy input by fine-tuning on in-domain noisy data, without affecting translation quality on the news domain.
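As a rough illustration of the fine-tuning step described above, the sketch below continues training an already-converged Transformer NMT model on a small noisy in-domain parallel corpus with a low learning rate. The pretrained checkpoint (Helsinki-NLP/opus-mt-fr-en), the example sentence pairs, and the hyperparameters are illustrative assumptions only, not the actual CUNI WMT18 model or training configuration.

import torch
from transformers import MarianMTModel, MarianTokenizer

# Stand-in pretrained French-English Transformer; the real baseline is the
# CUNI WMT18 news system, which is not distributed through this interface.
model_name = "Helsinki-NLP/opus-mt-fr-en"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)
model.train()

# Hypothetical noisy in-domain sentence pairs (MTNT-style user-generated text).
noisy_pairs = [
    ("c tro bien ce resto !!", "this restaurant is soo good !!"),
    ("jsp quoi dire mdr", "idk what to say lol"),
]

# Small learning rate so the model adapts to the noisy domain without
# drifting far from the news-domain optimum.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-5)

for epoch in range(3):
    for src, tgt in noisy_pairs:
        batch = tokenizer(src, text_target=tgt, return_tensors="pt")
        loss = model(**batch).loss  # standard cross-entropy over target tokens
        loss.backward()
        optimizer.step()
        optimizer.zero_grad()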
