
Plug-and-Play Recipe Generation with Content Planning

by Yinhong Liu et al.
University of Cambridge
Monash University

Recent pre-trained language models have shown promising capabilities in generating fluent and realistic natural language text. However, generating multi-sentence text with global content planning has been a long-standing research problem. Current approaches to controlled text generation struggle to address this issue, as they usually condition on a single known control attribute. In this study, we propose a low-cost yet effective framework that explicitly models the global content plan of the generated text. Specifically, it optimizes the joint distribution of the natural language sequence and the global content plan in a plug-and-play manner. We conduct extensive experiments on the well-established Recipe1M+ benchmark. Both automatic and human evaluations verify that our model achieves state-of-the-art performance on the task of recipe generation.
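The abstract describes scoring text under the joint distribution of the language sequence and a global content plan without retraining the base LM. As a rough illustration only (the paper's actual plan representation and scoring functions are not given here, so the stage labels, keyword scorer, and weighting below are all invented stand-ins), a plug-and-play combination can be sketched as reranking candidate recipe steps by base-LM fluency plus plan fit:

```python
import math

# Hypothetical stand-in for a pre-trained LM's log-probability:
# here it just penalizes length, purely for illustration.
def lm_log_prob(sentence):
    return -0.1 * len(sentence.split())

# Invented toy content plan: a recipe's global stages and keywords.
PLAN_KEYWORDS = {
    "prep": {"chop", "slice", "wash"},
    "cook": {"fry", "boil", "bake"},
    "serve": {"plate", "garnish", "serve"},
}

# Hypothetical stand-in plan model: log-probability that a sentence
# realizes the target stage, via keyword overlap.
def plan_log_prob(sentence, stage):
    hits = len(set(sentence.lower().split()) & PLAN_KEYWORDS[stage])
    return math.log(1 + hits) - 1.0

# Plug-and-play joint scoring: combine the frozen LM score with the
# plan score at decoding time, with no fine-tuning of the LM.
def rerank(candidates, stage, alpha=1.0):
    return max(
        candidates,
        key=lambda s: lm_log_prob(s) + alpha * plan_log_prob(s, stage),
    )

candidates = [
    "Serve the soup with garnish",
    "Chop the onions and wash the celery",
]
best = rerank(candidates, stage="prep")  # picks the step matching the plan
```

The design point the sketch tries to convey is that the plan model only reweights candidates produced by the base LM, which is what makes the approach "plug-and-play".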



