Robust Open-Vocabulary Translation from Visual Text Representations

by Elizabeth Salesky et al.

Machine translation models have discrete vocabularies and commonly use subword segmentation techniques to achieve an 'open vocabulary.' This approach relies on consistent and correct underlying unicode sequences, and makes models susceptible to degradation from common types of noise and variation. Motivated by the robustness of human language processing, we propose the use of visual text representations, which dispense with a finite set of text embeddings in favor of continuous vocabularies created by processing visually rendered text. We show that models using visual text representations approach or match the performance of text baselines on clean TED datasets. More importantly, models with visual embeddings demonstrate significant robustness to varied types of noise, achieving, for example, 25.9 BLEU on a character-permuted German–English task where subword models degrade to 1.9.
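To make the core idea concrete, here is a minimal sketch of what "visual text representations" means per the abstract: instead of mapping discrete subword tokens to rows of a finite embedding table, the text is rendered as pixels and the rendered line is sliced into overlapping windows that a vision encoder would consume. The 3x5 bitmap font and the window/stride sizes below are illustrative assumptions for this sketch, not the paper's actual rendering setup, and the vision encoder itself is omitted.

```python
# Hypothetical tiny bitmap font: 5 rows x 3 columns per glyph.
FONT = {
    "a": ["010", "101", "111", "101", "101"],
    "b": ["110", "101", "110", "101", "110"],
    " ": ["000", "000", "000", "000", "000"],
}

def render(text):
    """Render text as a 5-row pixel matrix by concatenating glyph columns."""
    rows = [[] for _ in range(5)]
    for ch in text:
        glyph = FONT.get(ch, FONT[" "])  # unknown characters render as blanks
        for r in range(5):
            rows[r].extend(int(p) for p in glyph[r])
    return rows

def sliding_windows(pixels, width=6, stride=3):
    """Slice the rendered line into overlapping fixed-width image patches."""
    total = len(pixels[0])
    return [
        [row[i:i + width] for row in pixels]
        for i in range(0, max(total - width, 0) + 1, stride)
    ]

line = render("ab a")          # 4 glyphs x 3 columns = a 5 x 12 pixel line
windows = sliding_windows(line)  # three overlapping 5 x 6 patches
# A vision encoder would map each patch to a continuous vector, replacing
# discrete subword embeddings; a permuted or noisy character sequence still
# yields visually similar pixels, which is the source of the robustness.
print(len(line[0]), len(windows))
```

Because the model never looks up discrete token ids, character-level noise changes the pixels only locally rather than producing entirely unknown subword sequences.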




Related papers:

- Neural Machine Translation of Rare Words with Subword Units
- Achieving Open Vocabulary Neural Machine Translation with Hybrid Word-Character Models
- Debiasing Word Embeddings Improves Multimodal Machine Translation
- Training on Synthetic Noise Improves Robustness to Natural Noise in Machine Translation
- VALHALLA: Visual Hallucination for Machine Translation
- On Target Segmentation for Direct Speech Translation
- Linguistic Features of Genre and Method Variation in Translation: A Computational Perspective