Optimizing Transformer for Low-Resource Neural Machine Translation

11/04/2020
by Ali Araabi, et al.

Language pairs with limited amounts of parallel data, also known as low-resource languages, remain a challenge for neural machine translation. While the Transformer model has achieved significant improvements for many language pairs and has become the de facto mainstream architecture, its capabilities under low-resource conditions have not yet been fully investigated. Our experiments on different subsets of the IWSLT14 training data show that the effectiveness of the Transformer under low-resource conditions is highly dependent on the hyper-parameter settings. We find that a Transformer optimized for low-resource conditions improves translation quality by up to 7.3 BLEU points over the default Transformer settings.
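The abstract does not enumerate the tuned hyper-parameters, but the kind of adjustment it describes can be sketched concretely. Below is a minimal PyTorch sketch contrasting the default Transformer-base configuration with a smaller, more heavily regularized variant of the sort such a hyper-parameter search would explore; the specific values (2 attention heads, 5 layers, dropout of 0.3, a 1024-unit feed-forward block) are illustrative assumptions, not the paper's reported optimum.

```python
# Sketch: default Transformer-base settings vs. a smaller, more heavily
# regularized configuration for low-resource training. Values in the
# low-resource variant are illustrative assumptions, not the paper's
# reported settings.
import torch.nn as nn


def count_params(model: nn.Module) -> int:
    """Total number of trainable parameters."""
    return sum(p.numel() for p in model.parameters())


# Default Transformer-base hyper-parameters (Vaswani et al., 2017).
default = nn.Transformer(
    d_model=512, nhead=8,
    num_encoder_layers=6, num_decoder_layers=6,
    dim_feedforward=2048, dropout=0.1,
)

# A low-resource-oriented variant: fewer layers and attention heads,
# a narrower feed-forward block, and stronger dropout.
low_resource = nn.Transformer(
    d_model=512, nhead=2,
    num_encoder_layers=5, num_decoder_layers=5,
    dim_feedforward=1024, dropout=0.3,
)

print(f"default:      {count_params(default):,} parameters")
print(f"low-resource: {count_params(low_resource):,} parameters")
```

The intuition behind such settings is that with little parallel data a lower-capacity, more strongly regularized model overfits less, which is consistent with the gains the paper reports over the default configuration.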


Related research:

05/01/2017 · Data Augmentation for Low-Resource Neural Machine Translation
The quality of a Neural Machine Translation system depends substantially...

10/01/2019 · Auto-Sizing the Transformer Network: Improving Speed, Efficiency, and Performance for Low-Resource Machine Translation
Neural sequence-to-sequence models, particularly the Transformer, are th...

04/09/2020 · On optimal transformer depth for low-resource language translation
Transformers have shown great promise as an approach to Neural Machine T...

12/24/2022 · Optimizing Deep Transformers for Chinese-Thai Low-Resource Translation
In this paper, we study the use of deep Transformer translation model fo...

09/05/2023 · Advancing Text-to-GLOSS Neural Translation Using a Novel Hyper-parameter Optimization Technique
In this paper, we investigate the use of transformers for Neural Machine...

02/01/2023 · Attention Link: An Efficient Attention-Based Low Resource Machine Translation Architecture
Transformers have achieved great success in machine translation, but tra...

10/07/2021 · Cross-Language Learning for Entity Matching
Transformer-based matching methods have significantly moved the state-of...
