Hierarchical Quantized Representations for Script Generation

08/28/2018
by Noah Weber, et al.

Scripts define knowledge about how everyday scenarios (such as going to a restaurant) are expected to unfold. One of the challenges in learning scripts is the hierarchical nature of the knowledge. For example, a suspect who is arrested might plead innocent or guilty, and a very different track of events is then expected to follow. To capture this type of information, we propose an autoencoder model with a latent space defined by a hierarchy of categorical variables. We utilize a recently proposed vector quantization based approach, which allows continuous embeddings to be associated with each latent variable value. This permits the decoder to softly decide what portions of the latent hierarchy to condition on by attending over the value embeddings for a given setting. Our model effectively encodes and generates scripts, outperforming a recent language modeling-based method on several standard tasks and achieving substantially lower perplexity.
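The two mechanisms the abstract names, quantizing a continuous encoding to a discrete latent value via a codebook of value embeddings, and letting the decoder softly attend over those embeddings rather than committing to one value, can be sketched as follows. This is a minimal illustrative sketch in NumPy, not the paper's actual architecture; the codebook size, dimensions, and function names are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: K latent values per categorical variable,
# each associated with a D-dimensional continuous embedding.
K, D = 8, 4
codebook = rng.normal(size=(K, D))  # one embedding per latent value

def quantize(z):
    """Hard assignment: map a continuous encoding z to the index and
    embedding of its nearest codebook entry (the straight-through
    gradient trick used in VQ training is omitted here)."""
    dists = np.sum((codebook - z) ** 2, axis=1)
    idx = int(np.argmin(dists))
    return idx, codebook[idx]

def attend(query):
    """Soft conditioning: instead of committing to one latent value,
    the decoder forms a softmax-weighted combination of all value
    embeddings, so it can softly choose what to condition on."""
    scores = codebook @ query
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ codebook  # convex combination of embeddings

z = rng.normal(size=D)       # stand-in for an encoder output
idx, e = quantize(z)         # discrete latent value + its embedding
ctx = attend(z)              # soft context vector for the decoder
```

In the hierarchical setting described above, one such codebook would exist per level of the latent hierarchy, with the decoder attending over the value embeddings at each level.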


