Steering Output Style and Topic in Neural Response Generation

09/09/2017
by   Di Wang, et al.

We propose simple and flexible training and decoding methods for influencing output style and topic in neural encoder-decoder based language generation. This capability is desirable in a variety of applications, including conversational systems, where successful agents need to produce language in a specific style and generate responses steered by a human puppeteer or external knowledge. We decompose the neural generation process into empirically easier sub-problems: a faithfulness model and a decoding method based on selective-sampling. We also describe training and sampling algorithms that bias the generation process with a specific language style restriction, or a topic restriction. Human evaluation results show that our proposed methods are able to restrict style and topic without degrading output quality in conversational tasks.
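The selective-sampling decoding described above can be illustrated with a minimal sketch: draw several candidate tokens from the base language model's distribution, then let a separate faithfulness/style scorer pick among them. The function name `selective_sample`, the toy dictionaries standing in for the neural models, and the `formal` scorer are all hypothetical stand-ins, not the paper's actual implementation.

```python
import random

def selective_sample(lm_probs, faithfulness_score, k=5, seed=0):
    """Selective sampling sketch: draw k candidate tokens from the
    language-model distribution, then keep the candidate that the
    faithfulness/style scorer ranks highest."""
    rng = random.Random(seed)
    tokens = list(lm_probs)
    weights = [lm_probs[t] for t in tokens]
    candidates = rng.choices(tokens, weights=weights, k=k)
    return max(candidates, key=faithfulness_score)

# Toy stand-in for the base model's next-token distribution.
lm_probs = {"hello": 0.5, "howdy": 0.3, "greetings": 0.2}

# Hypothetical style scorer that prefers a formal register.
formal = {"greetings": 1.0, "hello": 0.5, "howdy": 0.1}

token = selective_sample(lm_probs, lambda t: formal[t], k=5)
```

Because the scorer only reranks tokens the base model already proposed, fluency is preserved while the output is nudged toward the desired style, which mirrors the decomposition into a generation model and a faithfulness model.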

