Lexically Cohesive Neural Machine Translation with Copy Mechanism

10/11/2020
by Vipul Mishra, et al.

Lexically cohesive translations preserve consistency in word choices in document-level translation. We incorporate a copy mechanism into a context-aware neural machine translation model, allowing it to copy words from previous translation outputs. Unlike previous context-aware neural machine translation models that handle all discourse phenomena implicitly, our model explicitly addresses the lexical cohesion problem by boosting the probabilities of outputting words consistently. We conduct experiments on Japanese-to-English translation using an evaluation dataset for discourse translation. The results show that the proposed model significantly improves lexical cohesion compared to previous context-aware models.
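To illustrate the general idea of boosting probabilities of previously output words, here is a minimal pointer-generator-style sketch in PyTorch: the decoder's ordinary generation distribution is interpolated with a copy distribution over tokens from the previous translation output via a learned gate. All names (CopyOutputLayer, copy_gate, etc.) and the exact scoring function are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical sketch of a copy mechanism for lexical cohesion (not the paper's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class CopyOutputLayer(nn.Module):
    def __init__(self, hidden_size: int, vocab_size: int):
        super().__init__()
        self.generator = nn.Linear(hidden_size, vocab_size)  # ordinary output projection
        self.copy_gate = nn.Linear(hidden_size, 1)            # gate: how much to copy

    def forward(self, dec_state, prev_out_states, prev_out_ids):
        """
        dec_state:       (batch, hidden)       current decoder state
        prev_out_states: (batch, len, hidden)  encoded previous translation output
        prev_out_ids:    (batch, len)          token ids of the previous output
        returns:         (batch, vocab)        mixed output probabilities
        """
        # Standard generation distribution over the target vocabulary.
        p_gen = F.softmax(self.generator(dec_state), dim=-1)

        # Attention over tokens of the previous translation output.
        attn_scores = torch.bmm(prev_out_states, dec_state.unsqueeze(-1)).squeeze(-1)
        attn = F.softmax(attn_scores, dim=-1)                  # (batch, len)

        # Scatter attention mass onto the vocabulary to form the copy distribution.
        p_copy = torch.zeros_like(p_gen)
        p_copy.scatter_add_(1, prev_out_ids, attn)

        # The gate interpolates generation and copying; copying boosts the
        # probability of reusing previously chosen words (lexical cohesion).
        lam = torch.sigmoid(self.copy_gate(dec_state))         # (batch, 1)
        return (1.0 - lam) * p_gen + lam * p_copy
```

In such a formulation, words that appeared in earlier translation outputs receive extra probability mass whenever the gate opens, which encourages the decoder to reuse consistent word choices across a document.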


