Compositional Representation of Morphologically-Rich Input for Neural Machine Translation

05/05/2018
by   Duygu Ataman, et al.

Neural machine translation (NMT) models are typically trained with fixed-size input and output vocabularies, which creates an important bottleneck for their accuracy and generalization capability. As a solution, various studies have proposed segmenting words into sub-word units and performing translation at the sub-lexical level. However, statistical word segmentation methods have recently been shown to be prone to morphological errors, which can lead to inaccurate translations. In this paper, we propose to overcome this problem by replacing the source-language embedding layer of NMT with a bi-directional recurrent neural network that generates compositional representations of the input at any desired level of granularity. We test our approach in a low-resource setting with five languages from different morphological typologies, and under different composition assumptions. By training NMT to compose word representations from character n-grams, our approach consistently outperforms (by 1.71 to 2.48 BLEU points) NMT that learns embeddings of statistically generated sub-word units.
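The core idea, composing a word's representation from its character n-grams with a bi-directional RNN instead of looking up a fixed embedding, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the class name, dimensions, the use of a plain tanh recurrence (the paper's network would use trained, gated recurrent cells), and the boundary-padding scheme are all assumptions for the example.

```python
import math
import random

random.seed(0)
EMB, HID = 4, 3  # illustrative n-gram embedding and hidden sizes

def char_ngrams(word, n=3):
    """Split a word into overlapping character n-grams with boundary markers."""
    padded = f"#{word}#"
    return [padded[i:i + n] for i in range(len(padded) - n + 1)]

def rand_vec(n):
    return [random.uniform(-0.1, 0.1) for _ in range(n)]

def rand_mat(rows, cols):
    return [rand_vec(cols) for _ in range(rows)]

def matvec(W, x):
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

class BiRNNComposer:
    """Hypothetical stand-in for an NMT source embedding layer: builds a
    word vector from its character n-grams with a bi-directional RNN."""

    def __init__(self, ngram_vocab):
        self.idx = {g: i for i, g in enumerate(ngram_vocab)}
        self.E = rand_mat(len(ngram_vocab), EMB)   # n-gram embedding table
        self.Wf = rand_mat(HID, EMB + HID)         # forward recurrence
        self.Wb = rand_mat(HID, EMB + HID)         # backward recurrence

    def _run(self, xs, W):
        # Simple tanh recurrence over the n-gram embedding sequence.
        h = [0.0] * HID
        for x in xs:
            h = [math.tanh(v) for v in matvec(W, x + h)]
        return h

    def compose(self, word):
        xs = [self.E[self.idx[g]] for g in char_ngrams(word)]
        # Concatenate final forward and backward states as the word vector.
        return self._run(xs, self.Wf) + self._run(list(reversed(xs)), self.Wb)

grams = char_ngrams("evlerinde")  # Turkish: "in their houses"
composer = BiRNNComposer(sorted(set(grams)))
vec = composer.compose("evlerinde")
print(len(vec))  # 2 * HID = 6
```

Because the word vector is built from open-vocabulary character n-grams rather than a closed word list, morphological variants of the same stem (e.g. "evler", "evlerinde") share sub-word material, which is the property the paper exploits for morphologically-rich languages.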


