An In-depth Walkthrough on Evolution of Neural Machine Translation

04/10/2020
by Rohan Jagtap et al.

Neural Machine Translation (NMT) methodologies have evolved from simple feed-forward architectures to state-of-the-art Transformer-based models such as BERT. The use cases of NMT models have also broadened beyond language translation to conversational agents (chatbots), abstractive text summarization, image captioning, and more, where they have proved highly effective. This paper studies the major trends in Neural Machine Translation and the state-of-the-art models in the domain, and provides a high-level comparison between them.
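As a quick illustration of what such a Transformer-based NMT model looks like in practice (not part of the paper itself), the minimal sketch below assumes the Hugging Face transformers library and the publicly available Helsinki-NLP/opus-mt-en-de English-to-German checkpoint; both are illustrative choices, not models discussed by the authors.

# Minimal sketch: translating a sentence with a pretrained Transformer NMT model.
# Assumes the Hugging Face `transformers` library and the Helsinki-NLP/opus-mt-en-de
# checkpoint (English -> German); both are illustrative assumptions.
from transformers import pipeline

translator = pipeline("translation_en_to_de", model="Helsinki-NLP/opus-mt-en-de")

result = translator("Neural machine translation has evolved rapidly in recent years.")
print(result[0]["translation_text"])  # prints the German rendering of the sentence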

Related research

03/16/2018 - Tensor2Tensor for Neural Machine Translation
Tensor2Tensor is a library for deep learning models that is well-suited ...

11/13/2018 - Towards Neural Machine Translation for African Languages
Given that South African education is in crisis, strategies for improvem...

07/09/2018 - NMT-Keras: a Very Flexible Toolkit with a Focus on Interactive NMT and Online Learning
We present NMT-Keras, a flexible toolkit for training deep learning mode...

02/09/2018 - Zero-Resource Neural Machine Translation with Multi-Agent Communication Game
While end-to-end neural machine translation (NMT) has achieved notable s...

09/18/2017 - Toward a full-scale neural machine translation in production: the Booking.com use case
While some remarkable progress has been made in neural machine translati...

10/18/2019 - A language processing algorithm for predicting tactical solutions to an operational planning problem under uncertainty
This paper is devoted to the prediction of solutions to a stochastic dis...

12/10/2020 - Approches quantitatives de l'analyse des prédictions en traduction automatique neuronale (TAN)
As part of a larger project on optimal learning conditions in neural mac...
