PoeLM: A Meter- and Rhyme-Controllable Language Model for Unsupervised Poetry Generation

05/24/2022
by   Aitor Ormazabal, et al.

Formal verse poetry imposes strict constraints on the meter and rhyme scheme of poems. Most prior work on generating this type of poetry uses existing poems for supervision, which are difficult to obtain for most languages and poetic forms. In this work, we propose an unsupervised approach to generate poems that follow any given meter and rhyme scheme, without requiring any poetic text for training. Our method works by splitting a regular, non-poetic corpus into phrases, prepending control codes that describe the length and end rhyme of each phrase, and training a transformer language model on the augmented corpus. During inference, we build control codes for the desired meter and rhyme scheme, and condition our language model on them to generate formal verse poetry. Experiments in Spanish and Basque show that our approach is able to generate valid poems, which are often comparable in quality to those written by humans.
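The corpus-augmentation step described in the abstract can be illustrated with a minimal sketch. The control-code format, the crude vowel-group syllable counter, and the last-letters rhyme key below are all illustrative assumptions, not the paper's actual implementation:

```python
# Hypothetical sketch of the control-code augmentation described above.
# The token format (<LEN:...>, <RHYME:...>), the syllable counter, and the
# rhyme key are assumptions for illustration, not the paper's exact scheme.

def count_syllables(phrase: str) -> int:
    """Crude syllable count via vowel groups (Spanish-like orthography);
    a stand-in for a proper phonetic syllabifier."""
    vowels = "aeiou\u00e1\u00e9\u00ed\u00f3\u00fa\u00fc"
    count, prev_vowel = 0, False
    for ch in phrase.lower():
        is_vowel = ch in vowels
        if is_vowel and not prev_vowel:
            count += 1
        prev_vowel = is_vowel
    return count

def rhyme_key(phrase: str) -> str:
    """Approximate the end rhyme as the last three letters
    of the final word of the phrase."""
    word = phrase.rstrip(".,;:!?").split()[-1].lower()
    return word[-3:]

def augment(phrase: str) -> str:
    """Prepend length and end-rhyme control codes to a phrase,
    yielding one line of the augmented training corpus."""
    return f"<LEN:{count_syllables(phrase)}> <RHYME:{rhyme_key(phrase)}> {phrase}"

print(augment("cantando se alegra el corazon"))
# -> <LEN:11> <RHYME:zon> cantando se alegra el corazon
```

At inference time, the same kind of codes would be constructed from the target meter and rhyme scheme (e.g. `<LEN:11> <RHYME:a>` for an 11-syllable line in rhyme group "a") and fed to the trained model as a prompt prefix.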


