The Ubiqus English-Inuktitut System for WMT20

11/18/2020
by François Hernandez, et al.

This paper describes Ubiqus' submission to the WMT20 English-Inuktitut shared news translation task. Our main system, and only submission, is based on a multilingual approach, jointly training a Transformer model on several agglutinative languages. The English-Inuktitut translation task is challenging at every step, from data selection, preparation, and tokenization through to quality evaluation. Difficulties arise both from the peculiarities of the Inuktitut language and from the low-resource context.
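
The paper itself does not include code; the following is only a minimal sketch of the kind of multilingual data preparation the abstract alludes to, namely joint subword tokenization over several languages plus a target-language tag so a single Transformer can be trained jointly. SentencePiece, the file names, vocabulary size, and the `<2xx>` tag convention are assumptions for illustration, not details taken from the paper.

```python
# Sketch: shared subword model over all training languages, plus a
# target-language token prepended to each source line (a common recipe
# for multilingual NMT; not necessarily the exact setup of the paper).
import sentencepiece as spm

# Train one joint SentencePiece model over English, Inuktitut, and any
# additional agglutinative languages used for joint training.
# The corpus file names and vocab size below are hypothetical.
spm.SentencePieceTrainer.train(
    input="corpus.en,corpus.iu,corpus.fi",
    model_prefix="joint_spm",
    vocab_size=32000,
    character_coverage=1.0,
)

sp = spm.SentencePieceProcessor(model_file="joint_spm.model")

def encode_line(line: str, tgt_lang: str) -> str:
    """Tokenize a source line and prepend a target-language token."""
    pieces = sp.encode(line, out_type=str)
    return " ".join([f"<2{tgt_lang}>"] + pieces)

# Example: an English source sentence to be translated into Inuktitut.
print(encode_line("Hello world", "iu"))
```

In this kind of setup, the target-language token lets one shared model route each source sentence toward the desired output language, which is what makes joint training across several languages possible with a single Transformer.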
