Translation Word-Level Auto-Completion: What can we achieve out of the box?

by Yasmin Moslem, et al.

Research on Machine Translation (MT) has achieved important breakthroughs in several areas. While there is much more to be done to build on this success, we believe that the language industry needs better ways to take full advantage of current achievements. Due to a combination of factors, including time, resources, and skills, businesses tend to apply pragmatism to their AI workflows. Hence, they concentrate more on outcomes, e.g. delivery, shipping, releases, and features, and adopt high-level, working production solutions where possible. Among the features thought to be helpful for translators are sentence-level and word-level translation auto-suggestion and auto-completion. Suggesting alternatives can inspire translators and reduce their need to refer to external resources, which hopefully boosts their productivity. This work describes our submissions to WMT's shared task on word-level auto-completion, for the Chinese-to-English, English-to-Chinese, German-to-English, and English-to-German language directions. We investigate the possibility of using pre-trained models and out-of-the-box features from available libraries. We employ random sampling to generate diverse alternatives, which yields good results. Furthermore, we introduce our open-source API, based on CTranslate2, to serve translations, auto-suggestions, and auto-completions.
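The abstract describes sampling diverse translation hypotheses and deriving word-level completions from them. As a rough illustration of that idea (the paper's exact selection criteria are not given here, so this is a hedged sketch: the helper name, the model path, and the sampling settings are illustrative assumptions, not the authors' code), one can collect sampled hypotheses and pick the most frequent token that extends the prefix the translator has typed:

```python
from collections import Counter

def complete_word(hypotheses, prefix):
    """Given sampled translation hypotheses (token lists) and the prefix
    a translator has typed, return the most frequent candidate token
    that extends the prefix, or None if no hypothesis contains one.
    This selection rule is an assumption for illustration only."""
    counts = Counter(
        tok
        for hyp in hypotheses
        for tok in hyp
        if tok.startswith(prefix) and tok != prefix
    )
    return counts.most_common(1)[0][0] if counts else None

# The hypotheses themselves would come from random sampling with a
# pre-trained model, e.g. via CTranslate2 (model directory and sampling
# values below are placeholder assumptions):
#
#   import ctranslate2
#   translator = ctranslate2.Translator("ende_model/")
#   results = translator.translate_batch(
#       [source_tokens],
#       beam_size=1,
#       sampling_topk=10,
#       sampling_temperature=1.0,
#       num_hypotheses=8,
#   )
#   hypotheses = results[0].hypotheses
```

For example, if several sampled hypotheses contain "Katze" after the user types "Kat", that token would be suggested as the completion.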




