Conditional Generation and Snapshot Learning in Neural Dialogue Systems

06/10/2016
by Tsung-Hsien Wen et al.

Recently, a variety of LSTM-based conditional language models (LMs) have been applied across a range of language generation tasks. In this work we study various model architectures and different ways to represent and aggregate the source information in an end-to-end neural dialogue system framework. A method called snapshot learning is also proposed to facilitate learning from supervised sequential signals by applying a companion cross-entropy objective function to the conditioning vector. The experimental and analytical results demonstrate, firstly, that competition occurs between the conditioning vector and the LM, and that the differing architectures provide different trade-offs between the two. Secondly, the discriminative power and transparency of the conditioning vector are key to providing both model interpretability and better performance. Thirdly, snapshot learning leads to consistent performance improvements independent of which architecture is used.
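The core idea named in the abstract, supervising the conditioning vector with a per-step companion cross-entropy alongside the usual next-word loss, can be sketched as follows. This is a minimal PyTorch illustration, not the authors' implementation: the class SnapshotDecoder, the companion read-out layer cond_out, the snapshot_targets tensor (a multi-hot encoding of which items remain to be realised at each step), and the weight alpha are all assumptions made for the sake of the example.

```python
# Minimal sketch of a conditional LSTM LM with a companion ("snapshot")
# cross-entropy objective on the conditioning features. Illustrative only;
# names and shapes are assumptions, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SnapshotDecoder(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim, cond_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # One common conditioning scheme: concatenate the conditioning
        # vector to the word embedding at every decoding step.
        self.lstm = nn.LSTMCell(embed_dim + cond_dim, hidden_dim)
        self.word_out = nn.Linear(hidden_dim, vocab_size)
        # Companion read-out: re-predicts the conditioning features at each
        # step so a per-step cross-entropy can supervise them.
        self.cond_out = nn.Linear(hidden_dim, cond_dim)

    def forward(self, tokens, cond_vec):
        # tokens: (batch, T) gold word ids; cond_vec: (batch, cond_dim)
        batch = tokens.size(0)
        h = tokens.new_zeros(batch, self.lstm.hidden_size, dtype=torch.float)
        c = torch.zeros_like(h)
        word_logits, cond_logits = [], []
        for t in range(tokens.size(1)):
            x = torch.cat([self.embed(tokens[:, t]), cond_vec], dim=-1)
            h, c = self.lstm(x, (h, c))
            word_logits.append(self.word_out(h))
            cond_logits.append(self.cond_out(h))
        return torch.stack(word_logits, dim=1), torch.stack(cond_logits, dim=1)

def snapshot_objective(word_logits, cond_logits, tokens,
                       snapshot_targets, alpha=1.0):
    # Main objective: standard next-word cross-entropy of the LM.
    lm_loss = F.cross_entropy(
        word_logits[:, :-1].reshape(-1, word_logits.size(-1)),
        tokens[:, 1:].reshape(-1))
    # Companion objective: per-step binary cross-entropy between the
    # predicted conditioning features and the snapshot targets, e.g. the
    # slot-value items not yet realised at step t.
    companion = F.binary_cross_entropy_with_logits(cond_logits,
                                                   snapshot_targets)
    return lm_loss + alpha * companion
```

Because the snapshot targets change over the sequence, the companion term gives the decoder per-step supervision of the conditioning vector rather than a single sequence-level signal, which is one way to realise the trade-off between the conditioning vector and the LM that the abstract describes.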


