Miðeind's WMT 2021 submission

09/15/2021
by Haukur Barri Símonarson, et al.

We present Miðeind's submission for the English→Icelandic and Icelandic→English subsets of the 2021 WMT news translation task. Transformer-base models are first trained on parallel data and used to generate backtranslations iteratively. A pretrained mBART-25 model is then adapted for translation using the parallel data together with the final backtranslation iteration. The adapted model is used to regenerate the backtranslations, and its training is continued.
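The iterative backtranslation loop described above can be sketched in outline. This is a minimal, hypothetical illustration of the control flow only: the `train` and translation helpers below are trivial stand-ins (a lookup table over seen pairs), not real NMT training or the authors' actual pipeline.

```python
# Sketch of iterative backtranslation for en->is, with stub models.
# train() is a hypothetical stand-in: it returns a "model" that simply
# looks up sentences it was trained on and echoes anything unseen.

def train(parallel_pairs, synthetic_pairs=()):
    """Hypothetical trainer: returns a lookup-based stub 'model'."""
    lookup = dict(list(parallel_pairs) + list(synthetic_pairs))
    return lambda sentence: lookup.get(sentence, sentence)

# Toy placeholder data: (English, Icelandic) pairs and monolingual Icelandic.
parallel = [("hello", "halló"), ("thanks", "takk")]
mono_is = ["halló", "góðan dag"]

# Step 1: train a reverse (is->en) model on the parallel data.
is_en = train([(ic, en) for en, ic in parallel])

# Step 2: backtranslate monolingual Icelandic into synthetic (en, is) pairs.
synthetic_en_is = [(is_en(ic), ic) for ic in mono_is]

# Step 3: train the forward (en->is) model on parallel + synthetic data.
# In the paper's setup, the forward model regenerates backtranslations and
# training continues; here one round suffices to show the loop structure.
en_is = train(parallel, synthetic_en_is)
```

In the actual submission this loop is run for multiple rounds, and the final round's backtranslations are used to adapt the pretrained mBART-25 model, which then regenerates the backtranslations itself before its training is continued.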
