Monotonic Infinite Lookback Attention for Simultaneous Machine Translation

06/12/2019
by   Naveen Arivazhagan, et al.

Simultaneous machine translation begins translating each source sentence before the source speaker has finished speaking, with applications to live and streaming scenarios. Simultaneous systems must carefully schedule their reading of the source sentence to balance quality against latency. We present the first simultaneous translation system to learn an adaptive schedule jointly with a neural machine translation (NMT) model that attends over all source tokens read thus far. We do so by introducing Monotonic Infinite Lookback (MILk) attention, which maintains both a hard, monotonic attention head to schedule the reading of the source sentence, and a soft attention head that extends from the monotonic head back to the beginning of the source. We show that MILk's adaptive schedule allows it to arrive at latency-quality trade-offs that are favorable compared to those of a recently proposed wait-k strategy for many latency values.
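The two-head design described above can be illustrated with a minimal NumPy sketch of one greedy decoding step: a hard monotonic head scans forward over source tokens, deciding at each position whether to stop (write a target token) or advance (read one more source token); once it stops, a soft attention head looks back from that position to the start of the sentence. All names here (`milk_step`, `w_mono`, `w_soft`) and the random parameters are hypothetical, and the hard thresholded stopping rule stands in for the paper's learned schedule, which is trained jointly with the NMT model via expected attention rather than this greedy rule.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def milk_step(enc_states, dec_state, prev_head, w_mono, w_soft):
    """One greedy decoder step of MILk-style inference (illustrative sketch).

    enc_states: (num_tokens_read, d) encoder states for the source read so far;
    in a streaming setting, more rows are appended as the head advances.
    Returns (context vector, new monotonic head position).
    """
    head = prev_head
    # Hard monotonic head: scan forward, deciding at each source position
    # whether to stop (enough source has been read) or read one more token.
    while head < len(enc_states) - 1:
        p_stop = sigmoid(enc_states[head] @ w_mono @ dec_state)
        if p_stop > 0.5:  # stop here; the schedule says we can translate now
            break
        head += 1  # advance: read one more source token
    # Soft attention from the monotonic head back to the sentence start
    # ("infinite lookback" over the prefix enc_states[: head + 1]).
    scores = np.array([enc_states[j] @ w_soft @ dec_state
                       for j in range(head + 1)])
    alpha = softmax(scores)
    context = alpha @ enc_states[: head + 1]
    return context, head

# Toy demo with random parameters (a trained model would learn these).
d = 4
enc = rng.normal(size=(6, d))   # encoder states for 6 source tokens read so far
dec = rng.normal(size=d)        # current decoder state
w_mono = rng.normal(size=(d, d))
w_soft = rng.normal(size=(d, d))
ctx, head = milk_step(enc, dec, prev_head=0, w_mono=w_mono, w_soft=w_soft)
```

Note that the monotonic head only ever moves forward across decoding steps, which is what makes the reading schedule a valid streaming policy, while the lookback attention keeps full access to the already-read prefix.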


