
Controlling Linguistic Style Aspects in Neural Language Generation

07/09/2017
by Jessica Ficler, et al.

Most work on neural natural language generation (NNLG) focuses on controlling the content of the generated text. We experiment with controlling several stylistic aspects of the generated text, in addition to its content. The method is based on a conditioned RNN language model, where the desired content as well as the stylistic parameters serve as conditioning contexts. We demonstrate the approach on the movie reviews domain and show that it is successful in generating coherent sentences corresponding to the required linguistic style and content.
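To make the conditioning scheme concrete, the following is a minimal PyTorch sketch (not the authors' code; names such as ConditionedRNNLM, param_sizes, and the dimension defaults are placeholders) in which each content or style parameter is embedded and the resulting vectors are concatenated with the word embedding at every timestep, so the RNN's next-word distribution depends on both the generation history and the requested parameters. The paper's exact architecture and parameterization may differ.

```python
import torch
import torch.nn as nn

class ConditionedRNNLM(nn.Module):
    """Sketch of an RNN language model conditioned on discrete content/style parameters."""

    def __init__(self, vocab_size, param_sizes, emb_dim=128, cond_dim=16, hidden_dim=256):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, emb_dim)
        # One embedding table per conditioning parameter
        # (e.g. sentiment, length bucket, descriptiveness, content theme).
        self.cond_embs = nn.ModuleList(
            [nn.Embedding(n, cond_dim) for n in param_sizes])
        self.rnn = nn.LSTM(emb_dim + cond_dim * len(param_sizes),
                           hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens, cond_ids):
        # tokens:   (batch, seq_len) word ids of the prefix
        # cond_ids: (batch, num_params) one id per conditioning parameter
        w = self.word_emb(tokens)                                  # (B, T, E)
        c = torch.cat([emb(cond_ids[:, i])
                       for i, emb in enumerate(self.cond_embs)], dim=-1)
        c = c.unsqueeze(1).expand(-1, tokens.size(1), -1)          # repeat per step
        h, _ = self.rnn(torch.cat([w, c], dim=-1))                 # condition every input
        return self.out(h)                                         # next-word logits
```

At generation time the same conditioning vector is fed at every step, so sampling from the model with different parameter ids yields text with the same interface but different requested style and content.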


Related research

06/11/2022  Why is constrained neural language generation particularly challenging?
Recent advances in deep neural language models combined with the capacit...

11/08/2019  Low-Level Linguistic Controls for Style Transfer and Content Preservation
Despite the success of style transfer in image processing, it has seen l...

12/09/2016  Evaluating Creative Language Generation: The Case of Rap Lyric Ghostwriting
Language generation tasks that seek to mimic human ability to use langua...

05/16/2018  Learning to Write with Cooperative Discriminators
Recurrent Neural Networks (RNNs) are powerful autoregressive sequence mo...

07/14/2020  Modeling Coherency in Generated Emails by Leveraging Deep Neural Learners
Advanced machine learning and natural language techniques enable attacke...

07/22/2019  Maximizing Stylistic Control and Semantic Accuracy in NLG: Personality Variation and Discourse Contrast
Neural generation methods for task-oriented dialogue typically generate ...

05/16/2020  A Text Reassembling Approach to Natural Language Generation
Recent years have seen a number of proposals for performing Natural Lang...