Text Generation with Deep Variational GAN

04/27/2021
by Mahmoud Hossam, et al.

Generating realistic sequences is a central task in many machine learning applications. There has been considerable recent progress on building deep generative models for sequence generation tasks. However, mode collapse remains a major issue for current models. In this paper we propose a generic GAN-based framework that addresses mode collapse in a principled way. We modify the standard GAN objective to maximize a variational lower bound of the log-likelihood while minimizing the Jensen-Shannon divergence between the data and model distributions. We evaluate our model on text generation and show that it can generate realistic text with high diversity.
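The abstract describes combining two terms: a variational lower bound (ELBO) on the log-likelihood, which is maximized, and a standard GAN adversarial loss, which drives the Jensen-Shannon divergence between data and model distributions toward zero. The paper's exact formulation is not reproduced here; the following is a minimal sketch under toy Gaussian assumptions, with a hypothetical weighting coefficient `lam` chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def kl_diag_gaussian(mu, logvar):
    # Closed-form KL( N(mu, diag(exp(logvar))) || N(0, I) )
    return 0.5 * np.sum(np.exp(logvar) + mu**2 - 1.0 - logvar)

def elbo(x, x_recon, mu, logvar):
    # Variational lower bound: Gaussian reconstruction log-likelihood
    # (unit variance, constants dropped) minus the KL regularizer.
    recon_ll = -0.5 * np.sum((x - x_recon) ** 2)
    return recon_ll - kl_diag_gaussian(mu, logvar)

def gan_losses(d_real, d_fake, eps=1e-8):
    # Standard GAN cross-entropy losses from discriminator outputs in (0,1);
    # minimizing the discriminator loss at optimality relates the generator
    # objective to the Jensen-Shannon divergence.
    d_loss = -np.mean(np.log(d_real + eps) + np.log(1.0 - d_fake + eps))
    g_loss = -np.mean(np.log(d_fake + eps))  # non-saturating generator loss
    return d_loss, g_loss

# Toy tensors standing in for encoder outputs and discriminator scores
x = rng.normal(size=8)
mu = rng.normal(scale=0.1, size=4)
logvar = rng.normal(scale=0.1, size=4)
x_recon = x + rng.normal(scale=0.1, size=8)
d_real = rng.uniform(0.6, 0.9, size=8)
d_fake = rng.uniform(0.1, 0.4, size=8)

lb = elbo(x, x_recon, mu, logvar)
d_loss, g_loss = gan_losses(d_real, d_fake)

# Hypothetical combined generator objective (to minimize):
# negative ELBO plus a weighted adversarial term.
lam = 0.5
combined = -lb + lam * g_loss
print(f"ELBO={lb:.3f}  d_loss={d_loss:.3f}  g_loss={g_loss:.3f}  combined={combined:.3f}")
```

In practice both terms would be computed from neural encoder, generator, and discriminator networks and optimized jointly; this sketch only shows how the two objectives compose into a single scalar loss.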


