Event Representation with Sequential, Semi-Supervised Discrete Variables

Within the context of event modeling and understanding, we propose a new method for neural sequence modeling that takes partially observed sequences of discrete, external knowledge into account. We construct a sequential neural variational autoencoder that uses a carefully defined encoder and Gumbel-Softmax reparameterization to allow for successful backpropagation during training. We show that our approach outperforms multiple baselines and the state of the art in narrative script induction on multiple event modeling tasks, and that it converges more quickly.
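As context for the abstract above, the Gumbel-Softmax reparameterization it mentions replaces a non-differentiable categorical sample with a tempered softmax over noised logits, which is what makes backpropagation through discrete latent variables possible. The sketch below is a minimal, framework-free illustration of that trick; the function name and the use of NumPy are assumptions for exposition, not the paper's implementation.

```python
import numpy as np

def gumbel_softmax(logits, temperature=0.5, rng=None):
    """Draw a relaxed (approximately one-hot) sample from a categorical
    distribution parameterized by `logits`.

    Adds Gumbel(0, 1) noise to the logits and applies a tempered softmax;
    as `temperature` -> 0 the sample approaches a hard one-hot vector,
    while staying differentiable with respect to `logits`.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Gumbel(0, 1) noise via inverse transform sampling: g = -log(-log(u))
    u = rng.uniform(low=1e-10, high=1.0, size=np.shape(logits))
    g = -np.log(-np.log(u))
    y = (np.asarray(logits, dtype=float) + g) / temperature
    y = y - np.max(y)  # subtract max for numerical stability
    e = np.exp(y)
    return e / np.sum(e)

# Example: a relaxed sample over 4 hypothetical discrete event types
sample = gumbel_softmax(np.array([1.0, 2.0, 0.5, 0.1]), temperature=0.3)
```

Because the output is a point on the probability simplex rather than a hard index, it can be fed directly into downstream layers during training, with hard (argmax) samples used at inference time if discrete codes are required.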


