Deep Latent-Variable Models for Text Generation

03/03/2022
by Xiaoyu Shen, et al.

Text generation aims to produce human-like natural language output for downstream tasks. It covers a wide range of applications, such as machine translation, document summarization, and dialogue generation. Recently, deep neural network-based end-to-end architectures have been widely adopted. The end-to-end approach conflates all sub-modules, which used to be designed by complex handcrafted rules, into a holistic encode-decode architecture. Given enough training data, it can achieve state-of-the-art performance while avoiding the need for language- or domain-dependent knowledge. Nonetheless, deep learning models are known to be extremely data-hungry, and the text they generate usually suffers from low diversity, limited interpretability, and poor controllability. As a result, their output is difficult to trust in real-life applications. Deep latent-variable models, by specifying a probabilistic distribution over an intermediate latent process, provide a potential way of addressing these problems while maintaining the expressive power of deep neural networks. This dissertation presents how deep latent-variable models can improve over the standard encoder-decoder model for text generation.
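For illustration, the sketch below shows one common instance of such a model: a variational encoder-decoder in which an approximate posterior q(z|x) produces a latent code z that conditions the decoder, trained by minimizing the negative evidence lower bound (reconstruction loss plus a KL term). This is a minimal, hypothetical PyTorch example; the class name LatentSeq2Seq, the GRU encoder/decoder, and the toy dimensions are assumptions for demonstration, not the specific architecture developed in the dissertation.

```python
# Minimal sketch of a latent-variable encoder-decoder for text generation.
# Hypothetical names and sizes; illustrates the general technique only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LatentSeq2Seq(nn.Module):
    def __init__(self, vocab_size=1000, emb_dim=64, hid_dim=128, z_dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        # q(z | x): Gaussian posterior parameters from the encoder's final state
        self.to_mu = nn.Linear(hid_dim, z_dim)
        self.to_logvar = nn.Linear(hid_dim, z_dim)
        # the decoder is conditioned on z through its initial hidden state
        self.z_to_hidden = nn.Linear(z_dim, hid_dim)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, src, tgt_in):
        _, h = self.encoder(self.embed(src))              # h: (1, B, hid_dim)
        mu, logvar = self.to_mu(h[-1]), self.to_logvar(h[-1])
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        h0 = torch.tanh(self.z_to_hidden(z)).unsqueeze(0)
        dec_out, _ = self.decoder(self.embed(tgt_in), h0)
        return self.out(dec_out), mu, logvar

def elbo_loss(logits, tgt_out, mu, logvar):
    # negative ELBO = reconstruction loss + KL(q(z|x) || N(0, I))
    rec = F.cross_entropy(logits.reshape(-1, logits.size(-1)), tgt_out.reshape(-1))
    kl = -0.5 * torch.mean(torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=-1))
    return rec + kl

if __name__ == "__main__":
    model = LatentSeq2Seq()
    src = torch.randint(0, 1000, (4, 12))       # toy batch of token ids
    tgt_in, tgt_out = src[:, :-1], src[:, 1:]   # toy autoencoding target
    logits, mu, logvar = model(src, tgt_in)
    print(elbo_loss(logits, tgt_out, mu, logvar).item())
```

At generation time, one can sample z from the prior N(0, I) instead of the posterior and decode from it, which is what gives such models their extra diversity and controllability relative to a plain encoder-decoder.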

Related research

11/27/2017  Neural Text Generation: A Practical Guide
Deep learning methods have recently achieved great empirical success on ...

08/23/2019  Neural data-to-text generation: A comparison between pipeline and end-to-end architectures
Traditionally, most data-to-text applications have been designed using a...

02/20/2019  Mixture Models for Diverse Machine Translation: Tricks of the Trade
Mixture models trained via EM are among the simplest, most widely used a...

08/30/2019  Implicit Deep Latent Variable Models for Text Generation
Deep latent variable models (LVM) such as variational auto-encoder (VAE)...

05/10/2020  Posterior Control of Blackbox Generation
Text generation often requires high-precision output that obeys task-spe...

01/07/2020  Paraphrase Generation with Latent Bag of Words
Paraphrase generation is a longstanding important problem in natural lan...

11/10/2018  Dual Latent Variable Model for Low-Resource Natural Language Generation in Dialogue Systems
Recent deep learning models have shown improving results to natural lang...
