Reinforcement Learning for on-line Sequence Transformation

05/28/2021
by   Grzegorz Rypeść, et al.

A number of problems in the processing of sound and natural language, as well as in other areas, can be reduced to simultaneously reading an input sequence and writing an output sequence of generally different length. Well-developed methods exist that produce the output sequence from a fully known input; however, efficient methods that enable such transformations on-line do not. In this paper we introduce an architecture that learns, by reinforcement, to decide at each step whether to read the next input token or to write an output token. This architecture is able to transform potentially infinite sequences on-line. In an experimental study we compare it with state-of-the-art methods for neural machine translation. While it produces slightly worse translations than the Transformer, it outperforms the autoencoder with attention, even though our architecture translates texts on-line, thereby solving a more difficult problem than both reference methods.
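To illustrate the read/write framing the abstract describes, here is a minimal sketch of a *fixed* read/write schedule in the style of wait-k simultaneous translation (one of the reference approaches in this area): the decoder first reads k source tokens, then alternates writing and reading until the source is exhausted. The function name and interface are hypothetical; the paper's contribution is precisely that an RL agent *learns* this read/write decision instead of following a fixed rule.

```python
def wait_k_schedule(src_len, tgt_len, k):
    """Build a read/write action schedule for a wait-k policy.

    The agent first READs k source tokens, then alternates
    WRITE/READ; once the source is fully read, it only WRITEs.
    A learned policy (as in the paper) would choose these
    actions from the current state instead of a fixed rule.
    """
    actions, read, written = [], 0, 0
    while written < tgt_len:
        if read < min(k + written, src_len):
            actions.append("READ")
            read += 1
        else:
            actions.append("WRITE")
            written += 1
    return actions

# Example: 3 source tokens, 3 target tokens, k = 2
print(wait_k_schedule(3, 3, 2))
# ['READ', 'READ', 'WRITE', 'READ', 'WRITE', 'WRITE']
```

Note that the output can begin after only k source tokens have been seen, which is what makes on-line (simultaneous) transformation possible; replacing the fixed rule with a learned decision yields the adaptive behaviour the paper studies.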


