MOCHA: A Multi-Task Training Approach for Coherent Text Generation from Cognitive Perspective

10/26/2022
by Zhe Hu, et al.

Teaching neural models to generate narratively coherent texts is a critical problem. Recent pre-trained language models have achieved promising results, but a gap remains between human-written texts and machine-generated outputs. In this work, we propose a novel multi-task training strategy for coherent text generation grounded in the cognitive theory of writing, which empowers the model to learn the essential subskills of writing, namely planning and reviewing, in addition to end-to-end generation. We extensively evaluate our model on three open-ended generation tasks: story generation, news article writing, and argument generation. Experiments show that our model outperforms strong baselines in both few-shot and fully supervised settings, and human evaluations confirm that it generates more coherent outputs.
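The abstract does not spell out the implementation, but the core idea of framing planning, reviewing, and end-to-end generation as jointly trained subskills can be illustrated with a minimal sketch. Below is a hypothetical example, assuming a text-to-text setup with HuggingFace Transformers, where each subskill becomes a prefixed seq2seq task and the per-task losses are averaged per step. The task prefixes, toy examples, and uniform mixing are illustrative assumptions, not the paper's actual training recipe.

```python
# Hypothetical sketch of multi-task training over writing subskills.
# The subskill prefixes and toy examples below are assumptions for
# illustration; they are not taken from the MOCHA paper.
import random
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)

# Toy (source, target) pairs for each subskill.
TASKS = {
    "plan":     [("plan: write a story about a lost dog",
                  "1. dog runs away 2. owner searches 3. reunion")],
    "review":   [("review: the dog was lost. it was found.",
                  "coherent")],
    "generate": [("generate: write a story about a lost dog",
                  "The dog slipped out of the gate one morning...")],
}

def training_step():
    losses = []
    for task, examples in TASKS.items():
        src, tgt = random.choice(examples)
        enc = tokenizer(src, return_tensors="pt", truncation=True)
        labels = tokenizer(tgt, return_tensors="pt", truncation=True).input_ids
        out = model(**enc, labels=labels)   # seq2seq cross-entropy loss
        losses.append(out.loss)
    loss = torch.stack(losses).mean()       # uniform mixing over subskills
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    return loss.item()

for step in range(3):
    print(f"step {step}: loss {training_step():.3f}")
```

In practice the subskill data would be mixed according to a sampling schedule or loss weighting rather than the uniform average used here, which is chosen only to keep the sketch compact.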

Related research

10/07/2022  Visualize Before You Write: Imagination-Guided Open-Ended Text Generation
09/29/2022  Co-Writing Screenplays and Theatre Scripts with Language Models: An Evaluation by Industry Professionals
06/01/2021  DYPLOC: Dynamic Planning of Content Using Mixed Language Models for Text Generation
05/12/2023  Unsupervised Melody-Guided Lyrics Generation
07/12/2021  CatVRNN: Generating Category Texts via Multi-task Learning
08/15/2023  Teach LLMs to Personalize – An Approach inspired by Writing Education
05/22/2023  Look-back Decoding for Open-Ended Text Generation
