BSDAR: Beam Search Decoding with Attention Reward in Neural Keyphrase Generation

09/17/2019
by   Iftitahu Ni'mah, et al.

This study investigates two decoding problems in neural keyphrase generation: sequence length bias and beam diversity. We introduce an extension of beam search inference that uses word-level and n-gram-level attention scores to adjust and constrain Seq2Seq predictions at test time. Results show that the proposed solution overcomes the algorithm's bias toward shorter and nearly identical sequences, yielding a significant improvement in decoding performance when generating keyphrases that are both present in and absent from the source text.
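The core idea can be pictured as adding an attention-derived reward to each hypothesis score during beam search. The sketch below is an illustrative reading of that idea, not the authors' released implementation; the helper step_fn, the reward weight lam, and the word-level attention proxy are assumptions.

import heapq

def beam_step(beams, step_fn, beam_width=5, lam=1.0):
    """Expand each beam hypothesis by one token.

    beams   : list of (score, tokens) hypotheses kept so far
    step_fn : callable(tokens) -> list of (token, log_prob, attn_score),
              where attn_score is the attention mass the decoder places on
              source words when emitting `token` (a word-level reward proxy)
    """
    candidates = []
    for score, tokens in beams:
        for token, log_prob, attn_score in step_fn(tokens):
            # Attention-rewarded score: favor tokens grounded in the source
            # text, countering the bias toward short, near-identical beams.
            new_score = score + log_prob + lam * attn_score
            candidates.append((new_score, tokens + [token]))
    # Keep the top-k rescored hypotheses for the next step.
    return heapq.nlargest(beam_width, candidates, key=lambda c: c[0])

In this reading, setting lam to zero recovers standard log-probability beam search, while larger values push the decoder toward source-grounded, more diverse keyphrases.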


