Reducing Position Bias in Simultaneous Machine Translation with Length-Aware Framework

03/17/2022
by Shaolei Zhang, et al.

Simultaneous machine translation (SiMT) starts translating while still receiving the streaming source inputs, so the source sentence is always incomplete during translation. Unlike full-sentence MT, which uses the conventional sequence-to-sequence architecture, SiMT often applies a prefix-to-prefix architecture, which forces each target word to align with only a partial source prefix in order to adapt to the incomplete streaming source. However, source words in the front positions are then mistakenly treated as more important, since they appear in more prefixes, resulting in a position bias that makes the model pay more attention to the front source positions at test time. In this paper, we first analyze the phenomenon of position bias in SiMT, and then develop a Length-Aware Framework that reduces position bias by bridging the structural gap between SiMT and full-sentence MT. Specifically, given the streaming inputs, we first predict the full-sentence length and then fill the future source positions with positional encoding, thereby turning the streaming inputs into a pseudo full-sentence. The proposed framework can be integrated into most existing SiMT methods to further improve performance. Experiments on two representative SiMT methods, including the state-of-the-art adaptive policy, show that our method successfully reduces position bias and thereby achieves better SiMT performance.
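As a rough illustration of the framework described in the abstract, the PyTorch sketch below turns a streaming source prefix into a pseudo full-sentence: it predicts the full source length from the received words and fills the future positions with positional encoding alone. This is a minimal sketch under assumed details; the PseudoFullSentence module, the mean-pooled length_head classifier, and the sinusoidal encoding table are illustrative choices, not the paper's actual implementation.

```python
import torch
import torch.nn as nn


def sinusoidal_positions(max_len: int, d_model: int) -> torch.Tensor:
    """Standard Transformer sinusoidal positional-encoding table."""
    pos = torch.arange(max_len, dtype=torch.float32).unsqueeze(1)
    i = torch.arange(0, d_model, 2, dtype=torch.float32)
    angle = pos / torch.pow(10000.0, i / d_model)
    pe = torch.zeros(max_len, d_model)
    pe[:, 0::2] = torch.sin(angle)
    pe[:, 1::2] = torch.cos(angle)
    return pe


class PseudoFullSentence(nn.Module):
    """Hypothetical sketch of the length-aware idea: given embeddings of
    the received source prefix, predict the full-sentence length and
    append bare positional encodings for the not-yet-received positions,
    yielding a pseudo full-sentence for the encoder."""

    def __init__(self, d_model: int, max_len: int = 512):
        super().__init__()
        self.register_buffer("pe", sinusoidal_positions(max_len, d_model))
        # Assumed length predictor: mean-pooled prefix -> length class.
        self.length_head = nn.Linear(d_model, max_len)

    def forward(self, prefix_emb: torch.Tensor) -> torch.Tensor:
        # prefix_emb: (batch, prefix_len, d_model); token embeddings plus
        # positional encodings of the source words received so far.
        batch, prefix_len, _ = prefix_emb.shape
        # Predict the full source length; never shorter than the prefix.
        length_logits = self.length_head(prefix_emb.mean(dim=1))
        pred_len = int(length_logits.argmax(dim=-1).clamp(min=prefix_len).max())
        # Future positions carry positional encoding only, standing in
        # for the source words that have not arrived yet.
        future_pe = self.pe[prefix_len:pred_len].unsqueeze(0).expand(batch, -1, -1)
        return torch.cat([prefix_emb, future_pe], dim=1)
```

With the pseudo full-sentence in place, the encoder can attend over a fixed-length input as in full-sentence MT, so front positions no longer appear in disproportionately many prefixes. In practice the length predictor would be trained with its own loss; how the paper supervises and integrates it is not specified here.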

Related research:

- From Simultaneous to Streaming Machine Translation by Leveraging Streaming History (03/04/2022)
- Gaussian Multi-head Attention for Simultaneous Machine Translation (03/17/2022)
- Glancing Future for Simultaneous Machine Translation (09/12/2023)
- Stream-level Latency Evaluation for Simultaneous Machine Translation (04/18/2021)
- Transformer-based Automatic Post-Editing with a Context-Aware Encoding Approach for Multi-Source Inputs (08/15/2019)
- Re-translation versus Streaming for Simultaneous Translation (04/07/2020)
- A Continuum of Generation Tasks for Investigating Length Bias and Degenerate Repetition (10/19/2022)
