Simultaneous paraphrasing and translation by fine-tuning Transformer models

05/12/2020
by Rakesh Chada, et al.

This paper describes the third-place submission to the shared task on simultaneous translation and paraphrasing for language education at the 4th Workshop on Neural Generation and Translation (WNGT) at ACL 2020. The final system leverages pre-trained translation models and uses a Transformer architecture combined with an oversampling strategy to achieve competitive performance. The system significantly outperforms the baseline on the Hungarian language (a 27-point absolute improvement).
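The following is a minimal, illustrative sketch (not the authors' released code) of the approach outlined above: fine-tuning a pre-trained Transformer translation model while oversampling training pairs in proportion to their weights. The checkpoint name, data format, and weight values are assumptions made for illustration only.

```python
# Hedged sketch: fine-tuning a pre-trained English->Hungarian translation
# Transformer with a simple oversampling strategy. Model name, example data,
# and weights are hypothetical placeholders, not the authors' setup.
import random
import torch
from torch.optim import AdamW
from transformers import MarianMTModel, MarianTokenizer

MODEL_NAME = "Helsinki-NLP/opus-mt-en-hu"  # assumed pre-trained checkpoint
tokenizer = MarianTokenizer.from_pretrained(MODEL_NAME)
model = MarianMTModel.from_pretrained(MODEL_NAME)
optimizer = AdamW(model.parameters(), lr=3e-5)

# Hypothetical (prompt, accepted translation, weight) triples; in the shared
# task each English prompt has many accepted translations with learner-derived
# weights.
train_pairs = [
    ("i would like water", "kérek vizet", 0.8),
    ("i would like water", "szeretnék vizet", 0.2),
]

# Oversampling: repeat each pair roughly in proportion to its weight so that
# higher-weighted translations are seen more often during fine-tuning.
oversampled = []
for src, tgt, weight in train_pairs:
    oversampled.extend([(src, tgt)] * max(1, round(weight * 10)))
random.shuffle(oversampled)

model.train()
for src, tgt in oversampled:
    batch = tokenizer(src, text_target=tgt, return_tensors="pt", padding=True)
    loss = model(**batch).loss  # standard seq2seq cross-entropy loss
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```

In practice one would batch the oversampled pairs and train for several epochs; the sketch keeps a single-example loop only to make the oversampling step explicit.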
