Densifying Assumed-sparse Tensors: Improving Memory Efficiency and MPI Collective Performance during Tensor Accumulation for Parallelized Training of Neural Machine Translation

05/10/2019
by Derya Cavdar, et al.

Neural machine translation - using neural networks to translate human language - is an area of active research exploring new neuron types and network topologies with the goal of dramatically improving machine translation performance. Current state-of-the-art approaches, such as the multi-head attention-based transformer, require very large translation corpora and many epochs to produce models of reasonable quality. Recent attempts to parallelize the official TensorFlow "Transformer" model across multiple nodes have hit roadblocks due to excessive memory use and resulting out-of-memory errors when performing MPI collectives. This paper describes modifications made to the Horovod MPI-based distributed training framework that reduce memory usage for transformer models by converting assumed-sparse tensors to dense tensors, and subsequently replacing sparse gradient gather with dense gradient reduction. The result is a dramatic increase in scale-out capability, with CPU-only scaling tests achieving 91% weak scaling efficiency (300 nodes) and up to 65% strong scaling speedup (200 nodes) using the Stampede2 supercomputer.
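
As a rough illustration of the densification idea, the sketch below shows one way the conversion could look in a TensorFlow 1.x / Horovod setting. It is a minimal sketch under assumed conditions, not the authors' actual patch: the toy embedding model, the densify helper, and the hyperparameters are hypothetical. In TensorFlow, the gradient of an embedding lookup is emitted as a tf.IndexedSlices object, which Horovod would otherwise accumulate through a memory-hungry sparse allgather; converting such gradients to dense tensors first lets them be combined with an ordinary dense allreduce.

    # Minimal sketch, assuming TensorFlow 1.x graph mode with horovod.tensorflow
    # installed. Names (densify, toy embedding model) are illustrative only.
    import tensorflow as tf
    import horovod.tensorflow as hvd

    hvd.init()

    # Toy model: an embedding lookup, whose gradient TensorFlow emits as
    # tf.IndexedSlices, i.e. an "assumed sparse" tensor.
    embedding = tf.get_variable("embedding", [10000, 512])
    token_ids = tf.placeholder(tf.int32, shape=[None])
    loss = tf.reduce_sum(tf.nn.embedding_lookup(embedding, token_ids))

    def densify(grads_and_vars):
        # Scatter tf.IndexedSlices ("assumed sparse") gradients into ordinary
        # dense tensors so they can be reduced rather than gathered.
        out = []
        for grad, var in grads_and_vars:
            if isinstance(grad, tf.IndexedSlices):
                grad = tf.convert_to_tensor(grad)
            out.append((grad, var))
        return out

    opt = tf.train.AdamOptimizer(learning_rate=1e-3 * hvd.size())
    grads_and_vars = densify(opt.compute_gradients(loss))
    # Dense gradients flow through a dense allreduce instead of a sparse allgather.
    grads_and_vars = [(hvd.allreduce(g), v) for g, v in grads_and_vars]
    train_op = opt.apply_gradients(grads_and_vars)

Recent Horovod releases expose equivalent behaviour through a sparse_as_dense flag on hvd.DistributedOptimizer (assuming a version that includes it), so the manual helper above is only meant to make the gather-versus-reduce distinction concrete.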

