Latent Topic Conversational Models

09/19/2018
by   Tsung-Hsien Wen, et al.

Latent variable models have been a preferred choice in conversational modeling over sequence-to-sequence (seq2seq) models, which tend to generate generic and repetitive responses. Even so, training latent variable models remains difficult. In this paper, we propose the Latent Topic Conversational Model (LTCM), which augments seq2seq with a neural latent topic component to better guide response generation and make training easier. The neural topic component encodes information from the source sentence to build a global "topic" distribution over words, which the seq2seq model then consults at each generation step. We study in detail how the latent representation is learnt in both the vanilla model and LTCM. Our extensive experiments contribute to a better understanding and training of conditional latent models for language. Our results show that by sampling from the learnt latent representations, LTCM can generate diverse and interesting responses. In a subjective human evaluation, the judges also confirm that LTCM is the overall preferred option.
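The abstract describes the decoder consulting a global topic distribution at each generation step. The following is a minimal sketch of that idea, assuming (hypothetically) a simple mixture gate between the per-step seq2seq distribution and the topic distribution; the function names and the gating scheme are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array of logits."""
    e = np.exp(x - x.max())
    return e / e.sum()

def ltcm_step(decoder_logits, topic_word_dist, gate):
    """One generation step: blend the seq2seq word distribution with
    the global topic distribution. `gate` in [0, 1] weights the topic
    component (hypothetical gating; the paper's mechanism may differ)."""
    seq2seq_dist = softmax(decoder_logits)
    return (1.0 - gate) * seq2seq_dist + gate * topic_word_dist

# Toy example with a 5-word vocabulary.
rng = np.random.default_rng(0)
decoder_logits = rng.normal(size=5)
# Stands in for the neural topic component's output over the vocabulary.
topic_word_dist = softmax(rng.normal(size=5))
mixed = ltcm_step(decoder_logits, topic_word_dist, gate=0.3)
print(round(float(mixed.sum()), 6))  # a valid distribution: sums to 1.0
```

Because both inputs are valid distributions and the mixture weights sum to one, the blended output remains a valid distribution to sample the next word from.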

Related research

- 06/25/2019, Deep Conversational Recommender in Travel: When traveling to a foreign country, we are often in dire need of an int...
- 06/21/2016, Topic Aware Neural Response Generation: We consider incorporating topic information into the sequence-to-sequenc...
- 09/09/2017, Steering Output Style and Topic in Neural Response Generation: We propose simple and flexible training and decoding methods for influen...
- 10/17/2022, Sequential Topic Selection Model with Latent Variable for Topic-Grounded Dialogue: Recently, topic-grounded dialogue system has attracted significant atten...
- 12/18/2017, Multilingual Topic Models: Scientific publications have evolved several features for mitigating voc...
- 10/23/2020, Generating Long Financial Report using Conditional Variational Autoencoders with Knowledge Distillation: Automatically generating financial report from a piece of news is quite ...
- 11/08/2016, The Neural Noisy Channel: We formulate sequence to sequence transduction as a noisy channel decodi...
