An Adversarial Non-Autoregressive Model for Text Generation with Incomplete Information

05/06/2023
by   Da Ren, et al.

Non-autoregressive models have been widely studied in the Complete Information Scenario (CIS), in which a model has complete input information from which to produce its output. Their behavior in the Incomplete Information Scenario (IIS), however, remains largely unexplored. Our analyses reveal that the incomplete input information of the IIS amplifies the inherent limitations of existing non-autoregressive models trained under Maximum Likelihood Estimation. In this paper, we propose an Adversarial Non-autoregressive Transformer (ANT) for the IIS with two novel features: 1) Position Aware Self-Modulation, which provides more informative hidden representations, and 2) a Dependency Feed Forward Network, which strengthens the model's capacity for dependency modeling. We compare ANT with other mainstream models in the IIS and show that ANT achieves comparable performance with far fewer decoding iterations. We further demonstrate its potential in applications such as latent interpolation and semi-supervised learning.
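The abstract does not spell out how Position Aware Self-Modulation works, but the name suggests a FiLM-style scale-and-shift of the decoder's hidden states whose parameters are computed from the generator's latent vector together with each position's encoding. The following is a minimal illustrative sketch of that idea, not the paper's actual implementation: the function name, the `tanh` projections, and the weight matrices `W_gamma`/`W_beta` are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
d, T = 8, 5  # toy hidden size and output sequence length

# Latent vector from the adversarial generator: in the IIS there are no
# source tokens, so z is the only "input information" the decoder gets.
z = rng.normal(size=d)

# Standard sinusoidal position encodings, one per output slot.
pos = np.array([[np.sin(t / 10000 ** (2 * (i // 2) / d)) if i % 2 == 0
                 else np.cos(t / 10000 ** (2 * (i // 2) / d))
                 for i in range(d)] for t in range(T)])

# Hypothetical projection weights for the modulation parameters.
W_gamma = rng.normal(scale=0.1, size=(2 * d, d))
W_beta = rng.normal(scale=0.1, size=(2 * d, d))

def layer_norm(x, eps=1e-5):
    return (x - x.mean(-1, keepdims=True)) / (x.std(-1, keepdims=True) + eps)

def position_aware_self_modulation(h, z, pos):
    """Scale and shift each position's normalized hidden state with
    parameters computed from the latent z AND that position's encoding,
    so every output slot starts from a distinct representation."""
    cond = np.concatenate([np.tile(z, (pos.shape[0], 1)), pos], axis=-1)
    gamma = np.tanh(cond @ W_gamma)   # (T, d) per-position scales
    beta = np.tanh(cond @ W_beta)     # (T, d) per-position shifts
    return (1 + gamma) * layer_norm(h) + beta

h0 = np.ones((T, d))                  # identical input at every slot
h1 = position_aware_self_modulation(h0, z, pos)
# Even with identical inputs, the modulated states now differ by position.
```

The point of the sketch is the contrast with plain self-modulation conditioned on `z` alone, which would give every position the same starting state; making the modulation position-dependent lets a parallel (non-autoregressive) decoder break that symmetry in a single step.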

Related research

11/25/2019 · Non-autoregressive Transformer by Position Learning
Non-autoregressive models are promising on various text generation tasks...

03/21/2021 · Non-Autoregressive Translation by Learning Target Categorical Codes
Non-autoregressive Transformer is a promising text generation model. How...

10/19/2022 · Autoregressive Generative Modeling with Noise Conditional Maximum Likelihood Estimation
We introduce a simple modification to the standard maximum likelihood es...

10/21/2021 · Improving Non-autoregressive Generation with Mixup Training
While pre-trained language models have achieved great success on various...

08/26/2022 · Nearest Neighbor Non-autoregressive Text Generation
Non-autoregressive (NAR) models can generate sentences with less computa...

08/30/2019 · Autoregressive Text Generation Beyond Feedback Loops
Autoregressive state transitions, where predictions are conditioned on p...

06/13/2022 · On the Learning of Non-Autoregressive Transformers
Non-autoregressive Transformer (NAT) is a family of text generation mode...
