Construct a Sentence with Multiple Specified Words

by Yuanliang Meng, et al.
Nuance Communications

This paper demonstrates finetuning a BART model to construct a sentence from an arbitrary set of words, a task that has historically been difficult in NLP. The training task is to make a sentence from four given words, but the trained model can also generate sentences when fewer or more words are provided. The output sentences are generally of high quality. The model has potential real-world applications, and the task can also serve as an evaluation mechanism for any language model.
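The training setup described above pairs a small word set with a full target sentence for seq2seq fine-tuning. A minimal sketch of how such training pairs could be prepared is shown below; the separator token, sampling scheme, and function name are illustrative assumptions, not the paper's exact recipe.

```python
import random

def make_training_pair(sentence, n_words=4, seed=0):
    """Build a (source, target) pair for seq2seq fine-tuning:
    sample n_words from the target sentence as the input word set.
    The '<w>' separator and random sampling are illustrative
    assumptions, not necessarily the paper's exact format."""
    rng = random.Random(seed)
    tokens = [t.strip(".,") for t in sentence.split()]
    words = rng.sample(tokens, n_words)
    source = " <w> ".join(words)  # hypothetical word-set encoding
    return source, sentence

src, tgt = make_training_pair("The quick brown fox jumps over the lazy dog.")
print(src)
print(tgt)
```

Pairs like these could then be fed to a standard encoder-decoder fine-tuning loop (e.g., with a pretrained BART checkpoint), where the model learns to produce the target sentence from the word set alone.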




