APo-VAE: Text Generation in Hyperbolic Space

04/30/2020
by Shuyang Dai, et al.

Natural language often exhibits inherent hierarchical structure ingrained with complex syntax and semantics. However, most state-of-the-art deep generative models learn embeddings only in Euclidean vector space, without accounting for this structural property of language. In this paper, we investigate text generation in a hyperbolic latent space to learn continuous hierarchical representations. An Adversarial Poincaré Variational Autoencoder (APo-VAE) is presented, where both the prior and the variational posterior of the latent variables are defined over a Poincaré ball via wrapped normal distributions. By adopting the primal-dual formulation of the KL divergence, an adversarial learning procedure is introduced to enable robust model training. Extensive experiments on language modeling and dialog-response generation tasks demonstrate the effectiveness of the proposed APo-VAE model over VAEs with Euclidean latent spaces, owing to its superior capability of capturing latent language hierarchies in hyperbolic space.
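To make the wrapped-normal construction concrete, the following is a minimal sketch of drawing a latent sample on the unit Poincaré ball: a Gaussian tangent vector is sampled at the origin, mapped onto the ball with the exponential map, and then translated to a mean point via Möbius addition. This is an illustrative simplification (it assumes curvature -1 and omits parallel transport and the density correction used in full wrapped-normal formulations); the function names are hypothetical and not from the APo-VAE codebase.

```python
import numpy as np

def mobius_add(x, y):
    # Möbius addition on the unit Poincaré ball (curvature -1);
    # the result of adding two points inside the ball stays inside the ball.
    xy = np.dot(x, y)
    x2 = np.dot(x, x)
    y2 = np.dot(y, y)
    num = (1.0 + 2.0 * xy + y2) * x + (1.0 - x2) * y
    den = 1.0 + 2.0 * xy + x2 * y2
    return num / den

def exp_map_zero(v):
    # Exponential map at the origin: a tangent vector v is mapped to a
    # point of norm tanh(||v||) < 1, i.e. always inside the unit ball.
    norm = np.linalg.norm(v)
    if norm < 1e-10:
        return v
    return np.tanh(norm) * v / norm

def sample_wrapped_normal(mu, sigma, rng):
    # Sketch of a wrapped-normal sample: Gaussian noise in the tangent
    # space at the origin, wrapped onto the ball, then moved to mu.
    v = rng.normal(0.0, sigma, size=mu.shape)
    return mobius_add(mu, exp_map_zero(v))

rng = np.random.default_rng(0)
mu = np.array([0.2, -0.1])   # a mean point inside the unit ball
z = sample_wrapped_normal(mu, 0.5, rng)
print(np.linalg.norm(z) < 1.0)  # samples remain inside the unit ball
```

Because both the exponential map and Möbius addition keep points strictly inside the unit ball, every sampled latent code is a valid point of the hyperbolic space, which is the property the Poincaré-ball prior and posterior rely on.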


Related research

11/02/2020 · Improving Variational Autoencoder for Text Modelling with Timestep-Wise Regularisation
The Variational Autoencoder (VAE) is a popular and powerful model applie...

12/03/2020 · Learning Hyperbolic Representations for Unsupervised 3D Segmentation
There exists a need for unsupervised 3D segmentation on complex volumetr...

08/28/2018 · Hierarchical Quantized Representations for Script Generation
Scripts define knowledge about how everyday scenarios (such as going to ...

01/05/2019 · Poincaré Wasserstein Autoencoder
This work presents a reformulation of the recently proposed Wasserstein ...

04/21/2020 · Discrete Variational Attention Models for Language Generation
Variational autoencoders have been widely applied for natural language g...

08/27/2018 · Natural Language Generation with Neural Variational Models
In this thesis, we explore the use of deep neural networks for generatio...

04/10/2018 · A Hierarchical Latent Structure for Variational Conversation Modeling
Variational autoencoders (VAE) combined with hierarchical RNNs have emer...
