ENCONTER: Entity Constrained Progressive Sequence Generation via Insertion-based Transformer

03/17/2021
by   Lee-Hsun Hsieh, et al.

Pretrained on large amounts of data, autoregressive language models are able to generate high-quality sequences. However, these models do not perform well under hard lexical constraints, as they lack fine-grained control over the content generation process. Progressive insertion-based transformers can overcome this limitation and efficiently generate a sequence in parallel given some input tokens as constraints. These transformers, however, may still fail to support hard lexical constraints because their generation process is prone to terminating prematurely. This paper analyzes such early termination problems and proposes the Entity-Constrained Insertion Transformer (ENCONTER), a new insertion transformer that addresses this pitfall without significantly compromising generation efficiency. We introduce a new training strategy that accounts for predefined hard lexical constraints (e.g., entities that must appear in the generated sequence). Our experiments show that ENCONTER outperforms baseline models on several performance metrics, rendering it more suitable for practical applications. Our code is available at https://github.com/LARC-CMU-SMU/Enconter
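To make the decoding process concrete, below is a minimal sketch of progressive insertion-based generation under hard lexical constraints. The function name predict_insertions, the <none> slot token, and the stubbed model are illustrative assumptions, not the paper's actual interface; a trained insertion transformer would score vocabulary tokens for every slot in parallel.

```python
# Minimal sketch of progressive insertion-based decoding with hard
# lexical constraints. All names here are hypothetical stand-ins; the
# model is stubbed so the loop is runnable end to end.

from typing import List

EOS_SLOT = "<none>"  # special token meaning "insert nothing in this slot"

def predict_insertions(canvas: List[str]) -> List[str]:
    """Stub: return one proposed token per insertion slot.

    A real insertion transformer would score tokens for each of the
    len(canvas) + 1 slots in parallel. This stub always declines to
    insert, which reproduces the early-termination failure mode the
    paper analyzes: generation halts after zero refinement steps.
    """
    return [EOS_SLOT] * (len(canvas) + 1)

def generate(constraints: List[str], max_steps: int = 10) -> List[str]:
    # Seed the canvas with the hard lexical constraints (e.g., entities),
    # which are guaranteed to appear, in order, in the final sequence.
    canvas = list(constraints)
    for _ in range(max_steps):
        proposals = predict_insertions(canvas)
        if all(tok == EOS_SLOT for tok in proposals):
            break  # every slot predicts "no insertion": generation ends
        # Interleave accepted insertions with the existing tokens;
        # slot i sits immediately before canvas[i].
        new_canvas = []
        for slot, tok in enumerate(proposals):
            if tok != EOS_SLOT:
                new_canvas.append(tok)
            if slot < len(canvas):
                new_canvas.append(canvas[slot])
        canvas = new_canvas
    return canvas

if __name__ == "__main__":
    print(generate(["Alan", "Turing", "Cambridge"]))
```

Because the stub predicts "no insertion" everywhere, the loop exits on the first step and returns only the constraint tokens; this is the premature-termination behavior the abstract describes. ENCONTER's training strategy, as the abstract summarizes it, is designed so that a constraint-seeded canvas does not collapse to all empty-slot predictions before the sequence is complete.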

Related research

- Cure the headache of Transformers via Collinear Constrained Attention (09/15/2023)
  As the rapid progression of practical applications based on Large Langua...

- POINTER: Constrained Text Generation via Insertion-based Generative Pre-training (05/01/2020)
  Large-scale pre-trained language models, such as BERT and GPT-2, have ac...

- Vision Transformer with Progressive Sampling (08/03/2021)
  Transformers with powerful global relation modeling abilities have been ...

- Extract, Denoise, and Enforce: Evaluating and Predicting Lexical Constraints for Conditional Text Generation (04/18/2021)
  Recently, pre-trained language models (PLMs) have dominated conditional ...

- Towards Accurate Translation via Semantically Appropriate Application of Lexical Constraints (06/21/2023)
  Lexically-constrained NMT (LNMT) aims to incorporate user-provided termi...

- EDITOR: an Edit-Based Transformer with Repositioning for Neural Machine Translation with Soft Lexical Constraints (11/13/2020)
  We introduce an Edit-Based Transformer with Repositioning (EDITOR), whic...

- FastSeq: Make Sequence Generation Faster (06/08/2021)
  Transformer-based models have made tremendous impacts in natural languag...
