Topic Aware Neural Response Generation

06/21/2016
by Chen Xing, et al.

We consider incorporating topic information into the sequence-to-sequence framework to generate informative and interesting responses for chatbots. To this end, we propose a topic aware sequence-to-sequence (TA-Seq2Seq) model. The model uses topics to simulate the prior knowledge that guides humans to form informative and interesting responses in conversation, and leverages the topic information during generation through a joint attention mechanism and a biased generation probability. The joint attention mechanism summarizes the hidden vectors of an input message as context vectors by message attention, synthesizes topic vectors by topic attention over the topic words of the message obtained from a pre-trained LDA model, and lets these vectors jointly affect the generation of words in decoding. To increase the probability of topic words appearing in responses, the model modifies the generation probability of topic words by adding an extra probability item that biases the overall distribution. An empirical study with both automatic evaluation metrics and human annotations shows that TA-Seq2Seq generates more informative and interesting responses and significantly outperforms state-of-the-art response generation models.
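To make the decoding step concrete, the snippet below is a minimal PyTorch sketch of one TA-Seq2Seq decoder step with joint message/topic attention and a biased output distribution. It is illustrative rather than the authors' implementation: the class and tensor names, the use of a GRU cell, plain dot-product attention, the topic_vocab_ids argument, and the renormalization of the biased distribution are all simplifying assumptions; the paper's exact attention scoring and normalization may differ.

# Minimal sketch of one TA-Seq2Seq decoding step (illustrative, not the authors' code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopicAwareDecoderStep(nn.Module):
    def __init__(self, hidden_size, embed_size, vocab_size, topic_vocab_ids):
        super().__init__()
        # decoder cell consumes the previous word embedding plus the two attention vectors
        self.cell = nn.GRUCell(embed_size + 2 * hidden_size, hidden_size)
        self.out = nn.Linear(hidden_size, vocab_size)        # ordinary generation logits
        self.topic_out = nn.Linear(hidden_size, vocab_size)  # extra logits used only for topic words
        mask = torch.zeros(vocab_size)
        mask[topic_vocab_ids] = 1.0                          # 1 at vocabulary positions of topic words
        self.register_buffer("topic_mask", mask)

    def forward(self, prev_embed, prev_state, msg_hidden, topic_embed):
        # prev_embed: (B, E); prev_state: (B, H)
        # msg_hidden: (B, src_len, H) encoder states
        # topic_embed: (B, n_topics, H) embeddings of the message's LDA topic words
        query = prev_state.unsqueeze(2)                      # (B, H, 1)

        # message attention: context vector c summarizes the input message
        msg_scores = torch.bmm(msg_hidden, query).squeeze(2)             # (B, src_len)
        c = torch.bmm(F.softmax(msg_scores, dim=1).unsqueeze(1), msg_hidden).squeeze(1)

        # topic attention: topic vector o summarizes the topic words
        topic_scores = torch.bmm(topic_embed, query).squeeze(2)          # (B, n_topics)
        o = torch.bmm(F.softmax(topic_scores, dim=1).unsqueeze(1), topic_embed).squeeze(1)

        # the context and topic vectors jointly drive the next hidden state
        state = self.cell(torch.cat([prev_embed, c, o], dim=1), prev_state)

        # biased generation probability: topic words receive an extra probability item
        p_vocab = F.softmax(self.out(state), dim=1)
        p_topic = F.softmax(self.topic_out(state), dim=1) * self.topic_mask
        p = p_vocab + p_topic
        return state, p / p.sum(dim=1, keepdim=True)         # renormalize to a valid distribution

In this sketch the extra topic-word term is simply added and renormalized; the point it illustrates is that only topic words can receive the bias, which raises their chance of appearing in the generated response.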

Related research

Response Selection with Topic Clues for Retrieval-based Chatbots (04/30/2016)
We consider incorporating topic information into message-response matchi...

TopicRefine: Joint Topic Prediction and Dialogue Response Generation for Multi-turn End-to-End Dialogue System (09/11/2021)
A multi-turn dialogue always follows a specific topic thread, and topic ...

Latent Topic Conversational Models (09/19/2018)
Latent variable models have been a preferred choice in conversational mo...

Augmenting Neural Response Generation with Context-Aware Topical Attention (11/02/2018)
Sequence-to-Sequence (Seq2Seq) models have witnessed a notable success i...

Neural Response Generation with Dynamic Vocabularies (11/30/2017)
We study response generation for open domain conversation in chatbots. E...

An Affect-Rich Neural Conversational Model with Biased Attention and Weighted Cross-Entropy Loss (11/17/2018)
Affect conveys important implicit information in human communication. Ha...

Generating High-Quality and Informative Conversation Responses with Sequence-to-Sequence Models (01/11/2017)
Sequence-to-sequence models have been applied to the conversation respon...