Encoding Sentence Position in Context-Aware Neural Machine Translation with Concatenation

02/13/2023
by Lorenzo Lupo, et al.

Context-aware translation can be achieved by processing a concatenation of consecutive sentences with the standard translation approach. This paper investigates the intuitive idea of adopting segment embeddings for this task to help the Transformer discern the position of each sentence in the concatenation sequence. We compare various segment embeddings and propose novel methods to encode sentence position into token representations, showing that they do not benefit the vanilla concatenation approach except in a specific setting.
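For intuition, segment embeddings work like BERT's: each token receives an extra learned vector indicating which sentence of the concatenation window it belongs to, added on top of the usual token embedding. The sketch below is a minimal PyTorch illustration of this general idea, not the authors' exact implementation; the class name, separator id, and dimensions are hypothetical, and the standard sinusoidal position encoding is omitted for brevity.

```python
import torch
import torch.nn as nn

class SegmentAwareEmbedding(nn.Module):
    """Token embeddings plus a learned segment embedding that marks
    which sentence of the concatenation each token belongs to."""

    def __init__(self, vocab_size: int, d_model: int, max_segments: int = 4):
        super().__init__()
        self.tok = nn.Embedding(vocab_size, d_model)
        # One learned vector per sentence position in the concatenation window.
        self.seg = nn.Embedding(max_segments, d_model)
        self.d_model = d_model

    def forward(self, token_ids, segment_ids):
        # token_ids, segment_ids: (batch, seq_len). segment_ids holds 0 for
        # tokens of the first sentence, 1 for the second, and so on.
        return self.tok(token_ids) * self.d_model ** 0.5 + self.seg(segment_ids)

# Usage: derive segment ids from a (hypothetical) sentence-separator token.
SEP = 4
ids = torch.tensor([[11, 12, SEP, 21, 22, 23]])
seg = (ids == SEP).long().cumsum(dim=-1)        # [[0, 0, 1, 1, 1, 1]]
out = SegmentAwareEmbedding(1000, 512)(ids, seg)  # shape (1, 6, 512)
```

In this toy indexing the separator token is counted with the sentence that follows it; per the abstract, the paper finds that such extra sentence-position information rarely improves over plain concatenation.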

Related research

10/24/2022 · Focused Concatenation for Context-Aware Neural Machine Translation
A straightforward approach to context-aware neural machine translation c...

03/21/2019 · Selective Attention for Context-aware Neural Machine Translation
Despite the progress made in sentence-level NMT, current systems still f...

09/07/2021 · Revisiting Context Choices for Context-aware Machine Translation
One of the most popular methods for context-aware machine translation (M...

10/19/2022 · A baseline revisited: Pushing the limits of multi-segment models for context-aware translation
This paper addresses the task of contextual translation using multi-segm...

05/25/2018 · Context-Aware Neural Machine Translation Learns Anaphora Resolution
Standard machine translation systems process sentences in isolation and ...

04/18/2021 · Demystifying the Better Performance of Position Encoding Variants for Transformer
Transformers are state of the art models in NLP that map a given input s...

01/24/2021 · Longest segment of balanced parentheses – an exercise in program inversion in a segment problem (Functional Pearl)
Given a string of parentheses, the task is to find a longest consecutive...
