Beam Search Strategies for Neural Machine Translation

02/06/2017
by Markus Freitag, et al.

The basic concept in Neural Machine Translation (NMT) is to train a large neural network that maximizes the translation performance on a given parallel corpus. NMT then uses a simple left-to-right beam-search decoder to generate new translations that approximately maximize the trained conditional probability. The standard beam search strategy generates the target sentence word by word from left to right while keeping a fixed number of active candidates at each time step. First, this simple search is not adaptive: it also expands candidates whose scores are much worse than the current best. Second, it does not expand hypotheses that fall outside the best-scoring candidates, even if their scores are close to the best one. The latter problem can be mitigated by increasing the beam size until no further performance improvement is observed; while this can yield better quality, it has the drawback of slower decoding speed. In this paper, we concentrate on speeding up the decoder by applying a more flexible beam search strategy whose candidate size may vary at each time step depending on the candidate scores. We speed up the original decoder by up to 43% on Chinese-English translation without losing any translation quality.
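To make the idea concrete, the following is a minimal Python sketch of left-to-right beam search with relative-threshold pruning, one plausible way to let the candidate set shrink when hypotheses score far below the current best. The step_scores callback, the 0.6 threshold, and all helper names are illustrative assumptions for this sketch, not the authors' implementation.

    import math
    from typing import Callable, List, Tuple

    def pruned_beam_search(
        step_scores: Callable[[List[int]], List[Tuple[int, float]]],
        eos_id: int,
        beam_size: int = 5,
        rel_threshold: float = 0.6,
        max_len: int = 50,
    ) -> List[int]:
        """Left-to-right beam search whose active candidate set can shrink
        at each step when hypotheses score far below the current best.

        step_scores(prefix) is assumed to return (token_id, log_prob) pairs
        for the next token given a partial hypothesis (hypothetical interface).
        """
        # Each beam entry is (accumulated log-probability, token sequence).
        beams: List[Tuple[float, List[int]]] = [(0.0, [])]
        finished: List[Tuple[float, List[int]]] = []

        for _ in range(max_len):
            candidates: List[Tuple[float, List[int]]] = []
            for score, seq in beams:
                for tok, logp in step_scores(seq):
                    candidates.append((score + logp, seq + [tok]))
            if not candidates:
                break

            # Keep at most beam_size candidates, best first.
            candidates.sort(key=lambda c: c[0], reverse=True)
            candidates = candidates[:beam_size]

            # Relative-threshold pruning: drop candidates whose probability
            # falls below a fixed fraction of the best candidate's probability,
            # so the effective beam width varies from step to step.
            best = candidates[0][0]
            candidates = [
                c for c in candidates
                if c[0] >= best + math.log(rel_threshold)
            ]

            # Move hypotheses ending in EOS to the finished pool.
            beams = []
            for score, seq in candidates:
                if seq[-1] == eos_id:
                    finished.append((score, seq))
                else:
                    beams.append((score, seq))
            if not beams:
                break

        if not finished:
            finished = beams
        return max(finished, key=lambda c: c[0])[1]

Because pruned candidates are never expanded, each decoding step scores fewer hypotheses than a fixed-width beam, which is where the speedup comes from; the threshold controls how aggressively the beam narrows.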


