Prior Attention for Style-aware Sequence-to-Sequence Models

06/25/2018
by Lucas Sterckx, et al.

We extend sequence-to-sequence models with the ability to control the characteristics or style of the generated output, via attention that is generated a priori (before decoding) from a latent code vector. After training an initial attention-based sequence-to-sequence model, we use a variational auto-encoder conditioned on representations of input sequences and a latent code vector space to generate attention matrices. By sampling the code vector from specific regions of this latent space during decoding and imposing the prior attention generated from it on the seq2seq model, the output can be steered towards certain attributes. This is demonstrated for the task of sentence simplification, where the latent code vector allows control over output length and lexical simplification, and enables fine-tuning to optimize for different evaluation metrics.
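As a rough illustration only (this page links no code, and all module names, dimensions, and the mean-pooling choice below are assumptions rather than the authors' implementation), a PyTorch sketch of such a prior-attention generator could look like this: a small conditional VAE pools the encoder states, infers a latent code z, and decodes a (target length x source length) attention matrix that a seq2seq decoder could then use in place of its own attention.

# Minimal sketch, assuming a PyTorch seq2seq setup; names such as
# PriorAttentionVAE, enc_dim, code_dim and max_tgt_len are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PriorAttentionVAE(nn.Module):
    """Maps source states plus a latent code z to a prior attention matrix."""

    def __init__(self, enc_dim=256, code_dim=8, hidden=128, max_tgt_len=50):
        super().__init__()
        self.max_tgt_len = max_tgt_len
        # Inference network q(z | source): used at training time.
        self.to_mu = nn.Linear(enc_dim, code_dim)
        self.to_logvar = nn.Linear(enc_dim, code_dim)
        # Generator p(attention | z, source): one score per (source position, target step).
        self.gen = nn.Sequential(
            nn.Linear(enc_dim + code_dim, hidden),
            nn.Tanh(),
            nn.Linear(hidden, max_tgt_len),
        )

    def decode(self, z, src_states):
        # src_states: (batch, src_len, enc_dim); z: (batch, code_dim)
        batch, src_len, _ = src_states.shape
        z_exp = z.unsqueeze(1).expand(batch, src_len, -1)
        scores = self.gen(torch.cat([src_states, z_exp], dim=-1))  # (batch, src_len, max_tgt_len)
        # Softmax over source positions gives each target step an attention distribution.
        return F.softmax(scores.transpose(1, 2), dim=-1)           # (batch, max_tgt_len, src_len)

    def forward(self, src_states):
        src_summary = src_states.mean(dim=1)                   # simple pooled source representation
        mu, logvar = self.to_mu(src_summary), self.to_logvar(src_summary)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()   # reparameterisation trick
        return self.decode(z, src_states), mu, logvar

if __name__ == "__main__":
    vae = PriorAttentionVAE()
    src_states = torch.randn(2, 12, 256)        # stand-in encoder outputs
    prior_attn, mu, logvar = vae(src_states)
    print(prior_attn.shape)                     # torch.Size([2, 50, 12])
    # At decoding time one would instead pick z from a chosen region of the latent
    # space and impose prior_attn on the decoder to steer the style of the output.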

Related research

08/27/2018 - Natural Language Generation with Neural Variational Models
In this thesis, we explore the use of deep neural networks for generatio...

09/26/2016 - Online Segment to Segment Neural Transduction
We introduce an online neural sequence to sequence model that learns to ...

07/05/2020 - Automatically Generating Codes from Graphical Screenshots Based on Deep Autocoder
During software front-end development, the work to convert Graphical Use...

12/21/2017 - Variational Attention for Sequence-to-Sequence Models
The variational encoder-decoder (VED) encodes source information as a se...

08/09/2022 - Learning to Improve Code Efficiency
Improvements in the performance of computing systems, driven by Moore's ...

05/23/2018 - Amortized Context Vector Inference for Sequence-to-Sequence Networks
Neural attention (NA) is an effective mechanism for inferring complex st...

09/26/2019 - Attention Forcing for Sequence-to-sequence Model Training
Auto-regressive sequence-to-sequence models with attention mechanism hav...
