Exploring the Use of Attention within the Neural Machine Translation Decoder States to Translate Idioms

10/10/2018
by Giancarlo D. Salton et al.

Idioms pose problems for almost all Machine Translation systems. Idiomatic language is very frequent in everyday use and cannot simply be ignored. The recent interest in memory-augmented models in the field of Language Modelling has helped such systems achieve good results by bridging long-distance dependencies. In this paper we explore the use of such techniques within a Neural Machine Translation system to help translate idiomatic language.
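The abstract does not spell out the mechanism, but the general idea of attending over a decoder's previously produced states can be illustrated with a short sketch. The fragment below is a minimal, hypothetical example of dot-product attention over past decoder hidden states, assuming a PyTorch setup; the function name, tensor shapes, and the way the memory vector is combined with the current state are assumptions made for illustration, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def attend_over_past_states(current_state, past_states):
    """Illustrative only: attend over a decoder's own past hidden states.

    current_state: (batch, hidden)        -- decoder state at the current step
    past_states:   (batch, steps, hidden) -- stored states from earlier steps
    """
    # Dot-product score between the current state and every stored state.
    scores = torch.bmm(past_states, current_state.unsqueeze(2)).squeeze(2)  # (batch, steps)
    # Normalise the scores into attention weights over past time steps.
    weights = F.softmax(scores, dim=1)
    # The weighted sum of past states acts as a "memory" summary vector.
    memory = torch.bmm(weights.unsqueeze(1), past_states).squeeze(1)        # (batch, hidden)
    return memory, weights

# Hypothetical usage: fold the memory vector back into the current state
# before predicting the next target word.
batch, steps, hidden = 2, 5, 8
current = torch.randn(batch, hidden)
past = torch.randn(batch, steps, hidden)
memory, weights = attend_over_past_states(current, past)
combined = torch.tanh(current + memory)  # one simple way to combine them
```

Bridging long-distance dependencies this way would let the decoder look back at the states produced while generating the earlier words of an idiom, rather than relying on the current state alone.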

Related research

03/16/2018
Tensor2Tensor for Neural Machine Translation
Tensor2Tensor is a library for deep learning models that is well-suited ...

06/09/2019
Happy Together: Learning and Understanding Appraisal From Natural Language
In this paper, we explore various approaches for learning two types of a...

10/26/2015
Empirical Study on Deep Learning Models for Question Answering
In this paper we explore deep learning models with memory component or a...

05/18/2019
A Case Study: Exploiting Neural Machine Translation to Translate CUDA to OpenCL
The sequence-to-sequence (seq2seq) model for neural machine translation ...

07/08/2021
Using CollGram to Compare Formulaic Language in Human and Neural Machine Translation
A comparison of formulaic sequences in human and neural machine translat...

10/20/2020
Towards End-to-End In-Image Neural Machine Translation
In this paper, we offer a preliminary investigation into the task of in-...
