Improved Training of Mixture-of-Experts Language GANs

02/23/2023
by Yekun Chai, et al.

Despite their dramatic success in image generation, Generative Adversarial Networks (GANs) still face great challenges in synthesizing sequences of discrete elements, in particular human language. The difficulty in generator training arises from the generator's limited representation capacity and the uninformative learning signals obtained from the discriminator. In this work, we (1) first empirically show that the mixture-of-experts approach enhances the representation capacity of the generator for language GANs, and (2) harness the Feature Statistics Alignment (FSA) paradigm to provide fine-grained learning signals that advance generator training. Specifically, FSA forces the mean statistics of the fake-data distribution to approach those of real samples as closely as possible in a finite-dimensional feature space. Empirical studies on synthetic and real benchmarks show superior performance in quantitative evaluation and demonstrate the effectiveness of our approach to adversarial text generation.
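The two ingredients above can be sketched in a few lines of numpy. This is a minimal illustration, not the paper's implementation: the function names (`moe_layer`, `fsa_loss`), the shapes, and the use of a plain squared-L2 distance between batch mean features are all assumptions made for the sketch; the paper's actual generator architecture and feature extractor are not specified here.

```python
import numpy as np

rng = np.random.default_rng(0)

def moe_layer(x, expert_weights, gate_weights):
    """Minimal softmax-gated mixture-of-experts layer (hypothetical sketch).

    x: (batch, d_in); expert_weights: (n_experts, d_in, d_out);
    gate_weights: (d_in, n_experts).
    """
    logits = x @ gate_weights                    # (batch, n_experts)
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    gates = np.exp(logits)
    gates /= gates.sum(axis=1, keepdims=True)    # softmax gating probabilities
    # Every expert is a linear map here; real generators would be richer.
    expert_out = np.einsum('bi,eio->beo', x, expert_weights)  # (batch, n_experts, d_out)
    return np.einsum('be,beo->bo', gates, expert_out)         # gate-weighted mixture

def fsa_loss(real_feats, fake_feats):
    """Feature Statistics Alignment as described in the abstract:
    push the mean feature vector of fake samples toward that of real
    samples, measured here with a squared L2 distance (an assumption)."""
    diff = real_feats.mean(axis=0) - fake_feats.mean(axis=0)
    return float(diff @ diff)

# Toy demo with made-up shapes: 8 samples, 16-dim input, 4 experts, 32-dim features.
x = rng.normal(size=(8, 16))
fake_feats = moe_layer(x,
                       rng.normal(size=(4, 16, 32)),
                       rng.normal(size=(16, 4)))
real_feats = rng.normal(size=(8, 32))
loss = fsa_loss(real_feats, fake_feats)
```

In a full training loop, `fsa_loss` would be differentiated through the generator so its gradient supplies the fine-grained signal the abstract describes, complementing the discriminator's scalar real/fake judgment.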


Related research

Improving GAN Training with Probability Ratio Clipping and Sample Reweighting (06/12/2020)
Language Generation with Recurrent Generative Adversarial Networks without Pre-training (06/05/2017)
Generative Cooperative Networks for Natural Language Generation (01/28/2022)
Adversarial Ranking for Language Generation (05/31/2017)
Adversarial Text Generation Without Reinforcement Learning (10/11/2018)
TreeGAN: Syntax-Aware Sequence Generation with Generative Adversarial Networks (08/22/2018)
Globally Consistent Algorithms for Mixture of Experts (02/21/2018)
