One2Set: Generating Diverse Keyphrases as a Set

05/24/2021
by   Jiacheng Ye, et al.

Recently, sequence-to-sequence models have made remarkable progress on keyphrase generation (KG) by concatenating multiple keyphrases in a predefined order into a single target sequence during training. However, keyphrases are inherently an unordered set rather than an ordered sequence, and imposing a predefined order introduces a wrong bias during training that heavily penalizes shifts in the order between keyphrases. In this work, we propose a new training paradigm, One2Set, which requires no predefined order for concatenating the keyphrases. To fit this paradigm, we propose a novel model that uses a fixed set of learned control codes as conditions to generate a set of keyphrases in parallel. To solve the problem that there is no correspondence between each prediction and each target during training, we propose a K-step target assignment mechanism via bipartite matching, which greatly increases the diversity and reduces the duplication ratio of the generated keyphrases. Experimental results on multiple benchmarks demonstrate that our approach significantly outperforms state-of-the-art methods.
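The target assignment described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the cost is a toy negative log-likelihood over each slot's first K predicted tokens, the function name `assign_targets` is hypothetical, and the matching uses SciPy's Hungarian solver as a stand-in for a generic bipartite-matching routine.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment  # Hungarian algorithm

def assign_targets(pred_logprobs, targets):
    """Assign each target keyphrase to one prediction slot (illustrative sketch).

    pred_logprobs: array [num_slots, K, vocab] with log-probabilities of
                   the first K tokens produced by each parallel slot.
    targets:       list of target keyphrases, each a list of token ids.
    Returns a dict mapping target index -> assigned slot index.
    (In One2Set, slots left unmatched are trained toward a special
    "no keyphrase" target; that part is omitted here.)
    """
    num_slots, K, _ = pred_logprobs.shape
    cost = np.zeros((num_slots, len(targets)))
    for i in range(num_slots):
        for j, tgt in enumerate(targets):
            # Cost of pairing slot i with target j: negative log-likelihood
            # the slot assigns to the target's first (at most K) tokens.
            cost[i, j] = -sum(pred_logprobs[i, t, tok]
                              for t, tok in enumerate(tgt[:K]))
    rows, cols = linear_sum_assignment(cost)  # minimizes total matching cost
    return {int(c): int(r) for r, c in zip(rows, cols)}
```

Because the matching is recomputed from the model's own first-K-step predictions, each slot is supervised with whichever target it is currently closest to, so no fixed ordering of keyphrases is ever imposed.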


Related research

- 05/31/2023: A Sequence-to-Sequence Set Model for Text-to-Table Generation
  "Recently, the text-to-table generation task has attracted increasing att..."
- 11/01/2019: Sequence Modeling with Unconstrained Generation Order
  "The dominant approach to sequence generation is to produce a sequence in..."
- 09/09/2019: Does Order Matter? An Empirical Study on Generating Multiple Keyphrases as a Sequence
  "Recently, concatenating multiple keyphrases as a target sequence has bee..."
- 06/10/2018: Deconvolution-Based Global Decoding for Neural Machine Translation
  "A great proportion of sequence-to-sequence (Seq2Seq) models for Neural M..."
- 11/13/2022: WR-ONE2SET: Towards Well-Calibrated Keyphrase Generation
  "Keyphrase generation aims to automatically generate short phrases summar..."
- 09/24/2018: Neural Transductive Learning and Beyond: Morphological Generation in the Minimal-Resource Setting
  "Neural state-of-the-art sequence-to-sequence (seq2seq) models often do n..."
- 10/21/2019: Learning to Make Generalizable and Diverse Predictions for Retrosynthesis
  "We propose a new model for making generalizable and diverse retrosynthet..."
