CUNI Non-Autoregressive System for the WMT 22 Efficient Translation Shared Task

12/01/2022 · by Jindřich Helcl, et al.

We present a non-autoregressive system submission to the WMT 22 Efficient Translation Shared Task. Our system was used by Helcl et al. (2022) in an attempt to provide a fair comparison between non-autoregressive and autoregressive models. This submission is an effort to establish solid baselines along with a sound evaluation methodology, particularly in terms of measuring decoding speed. The model itself is a 12-layer Transformer trained with connectionist temporal classification on a dataset knowledge-distilled from a strong autoregressive teacher model.
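Since only the abstract is available here, a minimal sketch may help make the approach concrete: an encoder-only model that emits all output positions in one parallel pass and is trained with PyTorch's connectionist temporal classification (CTC) loss. This is not the authors' implementation; the toy model size, vocabulary, and the upsampling factor `SRC_UPSAMPLE` are hypothetical stand-ins for the paper's 12-layer Transformer and its distillation setup.

```python
# Hedged sketch of CTC-based non-autoregressive translation (not the CUNI code).
import torch
import torch.nn as nn

VOCAB = 1000        # hypothetical vocabulary size (blank symbol at index 0)
BLANK = 0           # CTC blank symbol
SRC_UPSAMPLE = 2    # hypothetical factor: CTC needs more output slots than target tokens

class TinyNATModel(nn.Module):
    """Toy stand-in for the paper's 12-layer non-autoregressive Transformer."""
    def __init__(self, d_model=64):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.proj = nn.Linear(d_model, VOCAB)

    def forward(self, src):
        # Upsample the source so the output is long enough for a CTC alignment.
        x = self.embed(src).repeat_interleave(SRC_UPSAMPLE, dim=1)
        # All output positions are predicted in parallel, with no decoder loop.
        return self.proj(self.encoder(x))  # (batch, T, vocab)

model = TinyNATModel()
ctc_loss = nn.CTCLoss(blank=BLANK, zero_infinity=True)

src = torch.randint(1, VOCAB, (8, 10))   # dummy source batch
tgt = torch.randint(1, VOCAB, (8, 7))    # dummy targets; in practice, the AR teacher's
                                         # outputs (knowledge distillation)
logits = model(src)                      # one parallel decoding pass
log_probs = logits.log_softmax(-1).transpose(0, 1)  # CTCLoss expects (T, batch, vocab)
input_lengths = torch.full((8,), logits.size(1), dtype=torch.long)
target_lengths = torch.full((8,), tgt.size(1), dtype=torch.long)
loss = ctc_loss(log_probs, tgt, input_lengths, target_lengths)
loss.backward()

# Greedy CTC decoding: collapse repeated tokens, then drop blanks.
best = logits.argmax(-1)                 # (batch, T)
for row in best:
    collapsed = torch.unique_consecutive(row)
    hypothesis = collapsed[collapsed != BLANK]
```

The single parallel forward pass, in place of token-by-token decoding, is what the shared task's decoding-speed measurements are meant to capture.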


Related Research

05/04/2022 · Non-Autoregressive Machine Translation: It's Not as Fast as it Seems
Efficient machine translation models are commercially important as they ...

09/23/2021 · The Volctrans GLAT System: Non-autoregressive Translation Meets WMT21
This paper describes the Volctrans' submission to the WMT21 news transla...

09/16/2019 · Global Autoregressive Models for Data-Efficient Sequence Learning
Standard autoregressive seq2seq models are easily trained by max-likelih...

06/18/2020 · Deep Encoder, Shallow Decoder: Reevaluating the Speed-Quality Tradeoff in Machine Translation
State-of-the-art neural machine translation models generate outputs auto...

03/21/2021 · Non-Autoregressive Translation by Learning Target Categorical Codes
Non-autoregressive Transformer is a promising text generation model. How...

10/12/2022 · Non-Autoregressive Machine Translation with Translation Memories
Non-autoregressive machine translation (NAT) has recently made great pro...

10/19/2020 · Hierarchical Autoregressive Modeling for Neural Video Compression
Recent work by Marino et al. (2020) showed improved performance in seque...
