Sampling and Filtering of Neural Machine Translation Distillation Data

04/01/2021
by Vilém Zouhar, et al.

In most neural machine translation (NMT) distillation or stealing scenarios, the goal is to preserve the performance of the target model (teacher). The teacher's highest-scoring hypothesis is commonly used to train a new model (student). If reference translations are also available, hypotheses that score well against the references can be upsampled, and poor hypotheses removed or undersampled. This paper explores the landscape of such importance sampling methods (pruning, hypothesis upsampling and undersampling, deduplication, and their combinations) with English-to-Czech and English-to-German MT models, using standard MT evaluation metrics. We show that careful upsampling, combined with the original data, leads to better performance than training only on the original data, only on the synthesized data, or on their direct combination.
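To make the pipeline concrete, below is a minimal sketch of reference-based sampling of distillation data. It assumes sentence-level BLEU from sacrebleu as the quality metric (the paper uses standard MT metrics); the function name sample_distillation_data, the thresholds, and the repeat count are illustrative assumptions, not the paper's exact setup.

```python
# Sketch: build student training pairs from teacher hypotheses.
# Assumes each source sentence comes with a list of teacher
# hypotheses and one reference translation. Sentence-level BLEU
# (sacrebleu) serves as the quality metric; thresholds and the
# upsampling repeat count are illustrative, not the paper's values.
from sacrebleu.metrics import BLEU

bleu = BLEU(effective_order=True)  # recommended for sentence-level BLEU

def sample_distillation_data(examples, prune_below=10.0, upsample_above=50.0):
    """Yield (source, hypothesis) training pairs for the student.

    examples: iterable of (source, hypotheses, reference) triples.
    Exact duplicate (source, hypothesis) pairs are dropped,
    poor hypotheses (BLEU < prune_below) are pruned, and good
    hypotheses (BLEU >= upsample_above) are duplicated.
    """
    seen = set()
    for source, hypotheses, reference in examples:
        for hyp in hypotheses:
            if (source, hyp) in seen:  # deduplication
                continue
            seen.add((source, hyp))
            score = bleu.sentence_score(hyp, [reference]).score
            if score < prune_below:    # pruning of poor hypotheses
                continue
            repeats = 2 if score >= upsample_above else 1  # upsampling
            for _ in range(repeats):
                yield source, hyp
```

Following the abstract's finding, the sampled synthetic pairs would then typically be concatenated with the original parallel data before training the student, rather than used in isolation.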


Related research

02/06/2017 · Ensemble Distillation for Neural Machine Translation
Knowledge distillation describes a method for training a student network...

12/16/2021 · Isometric MT: Neural Machine Translation for Automatic Dubbing
Automatic dubbing (AD) is among the use cases where translations should ...

12/06/2022 · Life-long Learning for Multilingual Neural Machine Translation with Knowledge Distillation
A common scenario of Multilingual Neural Machine Translation (MNMT) is t...

12/31/2020 · Exploring Monolingual Data for Neural Machine Translation with Knowledge Distillation
We explore two types of monolingual data that can be included in knowled...

06/12/2021 · Guiding Teacher Forcing with Seer Forcing for Neural Machine Translation
Although teacher forcing has become the main training paradigm for neura...

12/02/2019 · Language Model Bootstrapping Using Neural Machine Translation For Conversational Speech Recognition
Building conversational speech recognition systems for new languages is ...

07/17/2023 · Enhancing Supervised Learning with Contrastive Markings in Neural Machine Translation Training
Supervised learning in Neural Machine Translation (NMT) typically follow...
