How Transformer Revitalizes Character-based Neural Machine Translation: An Investigation on Japanese-Vietnamese Translation Systems

10/05/2019
by Thi-Vinh Ngo, et al.

When translating between Chinese-centric languages, many studies have found clear advantages in using characters as the translation unit. Unfortunately, traditional recurrent neural machine translation systems hinder the practical use of such character-based systems because of their architectural limitations: they handle extremely long sequences poorly and offer little opportunity to parallelize computation. In this paper, we demonstrate that the newer Transformer architecture performs character-based translation better than the recurrent one. We conduct experiments on a low-resource language pair, Japanese-Vietnamese, and our models considerably outperform state-of-the-art systems that employ word-based recurrent architectures.
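One practical appeal of character-level translation, as opposed to word-level, is that it needs no word segmenter for unspaced scripts such as Japanese. As a minimal illustrative sketch (the example sentence and preprocessing are assumptions, not taken from the paper), each character simply becomes one translation unit:

```python
# Minimal sketch of character-level tokenization for an unspaced script.
# The sentence below is illustrative; the paper's actual preprocessing
# pipeline may differ (e.g., handling of punctuation or casing).

def char_tokenize(sentence: str) -> list:
    """Treat every non-whitespace character as a translation unit."""
    return [ch for ch in sentence if not ch.isspace()]

tokens = char_tokenize("私は学生です")
print(tokens)  # ['私', 'は', '学', '生', 'で', 'す']
```

The resulting sequences are much longer than word-level ones, which is exactly why the Transformer's parallel self-attention, unlike a step-by-step recurrent decoder, makes character-based models practical.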


research
10/05/2022

Revisiting Syllables in Language Modelling and their Application on Low-Resource Machine Translation

Language modelling and machine translation tasks mostly use subword or c...
research
03/19/2016

A Character-Level Decoder without Explicit Segmentation for Neural Machine Translation

The existing machine translation systems, whether phrase-based or neural...
research
04/30/2020

Character-Level Translation with Self-attention

We explore the suitability of self-attention models for character-level ...
research
03/04/2020

Evaluating Low-Resource Machine Translation between Chinese and Vietnamese with Back-Translation

Back translation (BT) has been widely used and become one of standard te...
research
11/09/2019

A Reinforced Generation of Adversarial Samples for Neural Machine Translation

Neural machine translation systems tend to fail on less decent inputs d...
research
07/08/2019

An Intrinsic Nearest Neighbor Analysis of Neural Machine Translation Architectures

Earlier approaches indirectly studied the information captured by the hi...
research
09/05/2023

Advancing Text-to-GLOSS Neural Translation Using a Novel Hyper-parameter Optimization Technique

In this paper, we investigate the use of transformers for Neural Machine...
