Plug-and-Play Recipe Generation with Content Planning

12/09/2022
by Yinhong Liu, et al.

Recent pre-trained language models have shown promising capabilities in generating fluent and realistic natural language text. However, generating multi-sentence text with global content planning remains a long-standing research question. Current approaches to controlled text generation can hardly address this issue, as they usually condition on a single known control attribute. In this study, we propose a low-cost yet effective framework that explicitly models the global content plan of the generated text. Specifically, it optimizes the joint distribution of the natural language sequence and the global content plan in a plug-and-play manner. We conduct extensive experiments on the well-established Recipe1M+ benchmark. Both automatic and human evaluations verify that our model achieves state-of-the-art performance on the task of recipe generation.
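The abstract describes decoding from the joint distribution of the text and its global content plan without fine-tuning the base model. The paper's exact formulation is not given here, so the sketch below is only an illustration of one common plug-and-play realization: a product-of-experts decoding step that combines a frozen language model's next-token scores with guidance from a plan classifier. The names `lm_logits`, `plan_logits`, and the weight `lam` are hypothetical stand-ins, not the authors' API.

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB = 50   # toy vocabulary size
PLANS = 4    # toy number of content-plan labels (e.g. recipe stages)

def lm_logits(prefix):
    """Stand-in for a frozen base LM's next-token logits (hypothetical)."""
    return rng.normal(size=VOCAB)

def plan_logits(prefix, token):
    """Stand-in for a lightweight classifier p(z | prefix + token) (hypothetical)."""
    return rng.normal(size=PLANS)

def log_softmax(x):
    x = x - x.max()
    return x - np.log(np.exp(x).sum())

def plug_and_play_step(prefix, target_plan, lam=1.0):
    """Pick the next token by scoring log p_LM(x) + lam * log p(z | x),
    i.e. decode from the joint p(x, z) while keeping the LM frozen."""
    base = log_softmax(lm_logits(prefix))
    scores = np.empty(VOCAB)
    for tok in range(VOCAB):
        guidance = log_softmax(plan_logits(prefix, tok))[target_plan]
        scores[tok] = base[tok] + lam * guidance
    return int(scores.argmax())

print(plug_and_play_step(prefix=[], target_plan=2))
```

In this toy setting both distributions are random; in practice the stand-ins would be replaced by a real language model and a classifier trained to predict the plan label from a text prefix, with `lam` controlling how strongly the plan steers generation.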


Related research

06/07/2022 · Plot Writing From Pre-Trained Language Models
Pre-trained language models (PLMs) fail to generate long-form narrative ...

05/09/2021 · Knowledge-based Review Generation by Coherence Enhanced Text Planning
As a natural language generation task, it is challenging to generate inf...

05/24/2023 · Revisiting Sentence Union Generation as a Testbed for Text Consolidation
Tasks involving text generation based on multiple input texts, such as m...

12/08/2020 · Facts2Story: Controlling Text Generation by Key Facts
Recent advancements in self-attention neural network architectures have ...

08/31/2021 · Plan-then-Generate: Controlled Data-to-Text Generation via Planning
Recent developments in neural networks have led to the advance in data-t...

04/28/2023 · Text-Blueprint: An Interactive Platform for Plan-based Conditional Generation
While conditional generation models can now generate natural language we...

06/05/2020 · CoCon: A Self-Supervised Approach for Controlled Text Generation
Pretrained Transformer-based language models (LMs) display remarkable na...
