Variational Memory Encoder-Decoder

07/26/2018
by Hung Le, et al.

Introducing variability while maintaining coherence is a core task in learning to generate utterances in conversation. Standard neural encoder-decoder models and their extensions using conditional variational autoencoders often produce either trivial or digressive responses. To overcome this, we explore a novel approach that injects variability into the neural encoder-decoder via the use of external memory as a mixture model, namely the Variational Memory Encoder-Decoder (VMED). By associating each memory read with a mode in the latent mixture distribution at each timestep, our model can capture the variability observed in sequential data such as natural conversations. We empirically compare the proposed model against other recent approaches on various conversational datasets. The results show that VMED consistently achieves significant improvement over the others in both metric-based and qualitative evaluations.
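The core idea above — each memory read contributing one mode of a latent mixture — can be illustrated with a minimal numpy sketch. This is an assumption-based toy, not the paper's actual parameterization: here each memory slot directly supplies a Gaussian mean with unit variance, and the attention (read) weights serve as the mixture weights, whereas the full VMED learns these quantities end-to-end inside an encoder-decoder.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax
    e = np.exp(x - x.max())
    return e / e.sum()

def mog_latent_from_memory(memory, query, rng):
    """Toy Mixture-of-Gaussians latent read (illustrative only).

    Each of the K memory slots defines one Gaussian mode; the
    attention weights over slots act as mixture weights, so the
    latent sample z is drawn from a K-mode mixture at this timestep.
    """
    weights = softmax(memory @ query)      # (K,) read/mixture weights
    means = memory                         # (K, d): one mode per slot
    log_vars = np.zeros_like(memory)       # unit variance, for simplicity
    # sample a mode index, then sample z from that mode's Gaussian
    k = rng.choice(len(weights), p=weights)
    noise = rng.standard_normal(memory.shape[1])
    z = means[k] + np.exp(0.5 * log_vars[k]) * noise
    return z, weights

rng = np.random.default_rng(0)
memory = rng.standard_normal((4, 8))       # K=4 slots, d=8 dims
query = rng.standard_normal(8)             # decoder state acting as read key
z, w = mog_latent_from_memory(memory, query, rng)
print(z.shape, round(float(w.sum()), 6))
```

Because different decoding steps attend to different slots, the sampled latent can jump between modes over time, which is the mechanism the abstract credits for injecting variability without collapsing to a single generic response.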


