Deep learning and sub-word-unit approach in written art generation

01/22/2019
by Krzysztof Wołk, et al.

Automatic poetry generation is a novel and interesting application of natural language processing research. It has become more popular over the last few years due to the rapid development of technology and neural computing power. This line of research can be applied to the study of linguistics and literature, to social science experiments, or simply for entertainment. The most effective known method of artificial poem generation uses recurrent neural networks (RNNs). We also used RNNs to generate poems in the style of Adam Mickiewicz. Our network was trained on the poem Sir Thaddeus. For data pre-processing, we used a specialized stemming tool, which is one of the major innovations and contributions of this work. Our experiment was conducted on the source text divided into sub-word units (at a level of resolution close to syllables). This approach is novel and rarely employed in the published literature. Sub-word units seem a natural choice for analysing Polish, as the language is morphologically rich due to its case system, gender forms, and large vocabulary. Moreover, Sir Thaddeus contains rhymes, so the analysis of syllables can be meaningful. We verified our model with different settings of the temperature parameter, which controls the randomness of the generated text. We also compared our results with similar models trained on the same text but divided into characters (the most common approach, alongside the use of full word units). The differences were substantial: our solution generated much better poems, able to follow the metre and vocabulary of the source text.
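The temperature parameter mentioned above is a standard knob in neural text generation: the network's raw output scores (logits) are divided by the temperature before the softmax, so lower values make sampling more deterministic and higher values more random. Below is a minimal sketch of that sampling step, not the authors' code; all names and values are illustrative assumptions.

import numpy as np

def sample_next_unit(logits, temperature=1.0, rng=None):
    """Sample the index of the next sub-word unit from raw RNN output scores."""
    if rng is None:
        rng = np.random.default_rng()
    scaled = np.asarray(logits, dtype=np.float64) / temperature
    scaled -= scaled.max()  # subtract max for numerical stability
    probs = np.exp(scaled) / np.exp(scaled).sum()
    return rng.choice(len(probs), p=probs)

# Example: three candidate sub-word units with raw scores 2.0, 1.0, 0.1.
# At temperature 0.2 the top-scoring unit is chosen almost every time;
# at temperature 1.5 the distribution is much flatter, giving more varied text.
logits = [2.0, 1.0, 0.1]
print(sample_next_unit(logits, temperature=0.2))
print(sample_next_unit(logits, temperature=1.5))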
