
Controlling Linguistic Style Aspects in Neural Language Generation

by Jessica Ficler, et al.

Most work on neural natural language generation (NNLG) focuses on controlling the content of the generated text. We experiment with controlling several stylistic aspects of the generated text, in addition to its content. The method is based on a conditioned RNN language model, in which the desired content as well as the stylistic parameters serve as conditioning contexts. We demonstrate the approach on the movie-reviews domain and show that it successfully generates coherent sentences that match the required linguistic style and content.
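A common way to realize such a conditioned RNN language model is to embed the content and style parameters and concatenate that conditioning vector to the word embedding at every time step. The sketch below illustrates this idea; it is a hypothetical minimal implementation in PyTorch, not the authors' code, and the class and parameter names (`ConditionedLM`, `n_cond_values`, dimensions) are assumptions for illustration.

```python
# Hypothetical sketch of a conditioned RNN language model: an embedding of
# the content/style parameters is concatenated to the word embedding at
# every time step, so next-word prediction depends on both.
import torch
import torch.nn as nn


class ConditionedLM(nn.Module):
    def __init__(self, vocab_size, n_cond_values,
                 emb_dim=32, cond_dim=8, hidden_dim=64):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        # Assumption: each conditioning parameter (e.g. sentiment, length,
        # descriptiveness) is categorical and mapped to a learned embedding.
        self.cond_emb = nn.Embedding(n_cond_values, cond_dim)
        self.rnn = nn.LSTM(emb_dim + cond_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens, cond_id):
        # tokens: (batch, seq) word ids; cond_id: (batch,) condition ids
        w = self.word_emb(tokens)                           # (batch, seq, emb)
        c = self.cond_emb(cond_id)                          # (batch, cond)
        c = c.unsqueeze(1).expand(-1, tokens.size(1), -1)   # repeat per step
        h, _ = self.rnn(torch.cat([w, c], dim=-1))
        return self.out(h)                                  # next-word logits


lm = ConditionedLM(vocab_size=100, n_cond_values=4)
logits = lm(torch.randint(0, 100, (2, 5)), torch.tensor([1, 3]))
print(logits.shape)  # torch.Size([2, 5, 100])
```

At generation time, fixing `cond_id` to the desired style value steers sampling toward text with that property, while the token history supplies the content.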



