MacLaSa: Multi-Aspect Controllable Text Generation via Efficient Sampling from Compact Latent Space

05/22/2023
by   Hanxing Ding, et al.

Multi-aspect controllable text generation aims to generate fluent sentences that possess multiple desired attributes simultaneously. Traditional methods either combine many operators in the decoding stage, often requiring costly iteration or search in the discrete text space, or train separate controllers for each aspect, which degrades text quality due to the discrepancy between different aspects. To address these limitations, we introduce MacLaSa, a novel approach for multi-aspect control that estimates a compact latent space for multiple aspects and performs efficient sampling with a robust sampler based on ordinary differential equations (ODEs). To eliminate the domain gaps between different aspects, we use a Variational Autoencoder (VAE) network to map text sequences from varying data sources into close latent representations. The estimated latent space allows us to formulate joint energy-based models (EBMs) and plug in arbitrary attribute discriminators to achieve multi-aspect control. We then draw latent vector samples with an ODE-based sampler and feed them to the VAE decoder to produce the target text sequences. Experimental results demonstrate that MacLaSa outperforms several strong baselines on attribute relevance and textual quality while maintaining a high inference speed.
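The pipeline described above can be sketched as a toy example: each desired aspect contributes one energy term to a joint EBM over latent vectors, and a simple Euler integration of the gradient flow stands in for the ODE-based sampler. Everything here is hypothetical (quadratic energies instead of learned attribute discriminators, a 2-D latent space instead of a VAE's), a minimal sketch of the sampling idea rather than the paper's actual implementation.

```python
import numpy as np

def aspect_energy(z, center):
    # Hypothetical quadratic energy pulling z toward an
    # attribute-specific region of the latent space; in MacLaSa this
    # role is played by a learned attribute discriminator.
    return 0.5 * np.sum((z - center) ** 2)

def joint_energy_grad(z, centers):
    # Gradient of the joint EBM: energies of all desired aspects sum,
    # so their gradients sum as well.
    return sum(z - c for c in centers)

def ode_sample(z0, centers, steps=200, dt=0.05):
    # Euler integration of dz/dt = -grad E(z), a stand-in for the
    # ODE-based sampler; the trajectory drifts toward low-energy
    # (multi-attribute-satisfying) latent vectors.
    z = z0.copy()
    for _ in range(steps):
        z = z - dt * joint_energy_grad(z, centers)
    return z

# Two toy "aspects" whose energies are minimized at different points;
# the joint sample settles at their compromise, (0.5, 0.5), which
# would then be fed to the VAE decoder to generate text.
centers = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
z = ode_sample(np.zeros(2), centers)  # -> approx. [0.5, 0.5]
```

In the full method, `z` would be decoded by the VAE into a sentence carrying both attributes; the toy shows only why sampling in a shared latent space lets independent attribute constraints compose additively.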


