Generating Sentences Using a Dynamic Canvas

06/13/2018
by Harshil Shah, et al.

We introduce the Attentive Unsupervised Text (W)riter (AUTR), a word-level generative model for natural language. It uses a recurrent neural network with a dynamic attention and canvas memory mechanism to iteratively construct sentences. By inspecting the state of the canvas at intermediate stages, and where the model places its attention, we gain insight into how it constructs sentences. We demonstrate that AUTR learns a meaningful latent representation for each sentence, and achieves competitive log-likelihood lower bounds whilst being computationally efficient. It is effective at generating and reconstructing sentences, as well as imputing missing words.
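The iterative canvas mechanism described in the abstract can be sketched roughly as follows. This is an illustrative NumPy toy, not the paper's implementation: the random matrices `W_attn` and `W_write` and the noisy hidden-state update are made-up stand-ins for the learned RNN and attention parameters. At each step, the model attends to positions on a fixed-length canvas (one slot per output word) and blends a new write vector into the attended slots.

```python
import numpy as np

rng = np.random.default_rng(0)

T, L, D = 5, 8, 16   # generation steps, canvas length (words), hidden size

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical parameters standing in for learned weights.
W_attn = rng.normal(size=(D, L)) * 0.1
W_write = rng.normal(size=(D, D)) * 0.1

canvas = np.zeros((L, D))   # one slot per output word position
h = rng.normal(size=D)      # RNN hidden state (stub)

for t in range(T):
    h = np.tanh(h + rng.normal(size=D) * 0.1)  # stand-in for an RNN step
    attn = softmax(h @ W_attn)                 # where to write on the canvas
    write = np.tanh(h @ W_write)               # what to write
    # Convex update: each slot blends its old content with the write vector,
    # weighted by how much attention it received this step.
    canvas = (1 - attn)[:, None] * canvas + attn[:, None] * write
```

After the final step, each canvas row would be projected to a distribution over the vocabulary to emit a word; because every update is a convex blend, the model can revisit and overwrite earlier slots, which is what makes intermediate canvas states interpretable.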

Related research

11/19/2015 — Generating Sentences from a Continuous Space
The standard recurrent neural network language model (RNNLM) generates s...

05/29/2018 — Semantic Sentence Matching with Densely-connected Recurrent and Co-attentive Information
Sentence matching is widely used in various natural language tasks such ...

06/25/2022 — Construct a Sentence with Multiple Specified Words
This paper demonstrates a task to finetune a BART model so it can constr...

09/28/2016 — Hierarchical Memory Networks for Answer Selection on Unknown Words
Recently, end-to-end memory networks have shown promising results on Que...

11/01/2020 — Seeing Both the Forest and the Trees: Multi-head Attention for Joint Classification on Different Compositional Levels
In natural languages, words are used in association to construct sentenc...

05/07/2015 — Jointly Modeling Embedding and Translation to Bridge Video and Language
Automatically describing video content with natural language is a fundam...

07/06/2023 — Statistical Mechanics of Strahler Number via Random and Natural Language Sentences
The Strahler number was originally proposed to characterize the complexi...
