Construct a Sentence with Multiple Specified Words

06/25/2022
by Yuanliang Meng, et al.

This paper presents a task for finetuning a BART model so that it can construct a sentence from an arbitrary set of words, something that has historically been difficult in NLP. The model is trained to make sentences from four given words, but once trained it can also generate sentences when fewer or more words are provided. The output sentences are generally of high quality. The model has potential real-world applications, and the task can also serve as an evaluation mechanism for any language model.
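The training setup described above pairs a handful of words with the sentence they came from. A minimal sketch of how such (source, target) pairs might be built for sequence-to-sequence finetuning is shown below; the " ; " separator, the random sampling of four words, and the punctuation stripping are assumptions for illustration, not the paper's exact recipe.

```python
import random

def make_training_pair(sentence, n_words=4, seed=None):
    """Build one (source, target) pair for keyword-to-sentence finetuning.

    Hypothetical format: n_words content words are sampled from the target
    sentence and joined with a separator as the encoder input; the decoder
    is then trained to reproduce the full sentence.
    """
    rng = random.Random(seed)
    # Strip trailing punctuation so keywords match their surface forms.
    words = [w.strip(".,!?;:") for w in sentence.split()]
    # Sample in random order so the model cannot rely on prompt word order
    # (an assumption; the paper may order keywords differently).
    keywords = rng.sample(words, min(n_words, len(words)))
    source = " ; ".join(keywords)
    return source, sentence

src, tgt = make_training_pair(
    "The quick brown fox jumps over the lazy dog.", seed=0
)
```

Each `src` string would then be tokenized as the encoder input and `tgt` as the decoder target of a BART-style model; because the number of sampled words is just a parameter, the same scheme extends to fewer or more than four words at inference time.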


Related research

06/06/2019 · From Receptive to Productive: Learning to Use Confusing Words through Automatically Selected Example Sentences
Knowing how to use words appropriately has been a key to improving langu...

05/19/2022 · Sentences as connection paths: A neural language architecture of sentence structure in the brain
This article presents a neural language architecture of sentence structu...

05/25/2021 · Understanding Mobile GUI: from Pixel-Words to Screen-Sentences
The ubiquity of mobile phones makes mobile GUI understanding an importan...

12/02/2019 · Fiction Sentence Expansion and Enhancement via Focused Objective and Novelty Curve Sampling
We describe the task of sentence expansion and enhancement, in which a s...

07/14/2021 · Composing Conversational Negation
Negation in natural language does not follow Boolean logic and is theref...

11/06/2018 · Learning to Embed Sentences Using Attentive Recursive Trees
Sentence embedding is an effective feature representation for most deep ...

06/13/2018 · Generating Sentences Using a Dynamic Canvas
We introduce the Attentive Unsupervised Text (W)riter (AUTR), which is a...
