Construct a Sentence with Multiple Specified Words

06/25/2022
by Yuanliang Meng, et al.

This paper presents a task for fine-tuning a BART model so that it can construct a sentence from an arbitrary set of words, a historically difficult NLP task. The training task is to make sentences from four given words, but the trained model can also generate sentences when fewer or more words are provided. The output sentences are generally of high quality. The model has potential real-world applications, and the task can also serve as an evaluation mechanism for any language model.
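The paper's exact data pipeline is not shown on this page; the sketch below illustrates one plausible way to build keyword-to-sentence training pairs for such a fine-tuning task. The function name, the `<s>` separator, and the sampling scheme are all assumptions for illustration, not the authors' implementation.

```python
import random

def make_training_pair(sentence, num_keywords=4, seed=None):
    """Build a (source, target) pair for keyword-to-sentence fine-tuning.

    Sample `num_keywords` words from the sentence, shuffle their order,
    and join them with a separator to form the encoder input; the original
    sentence is the decoder target. Separator and names are illustrative.
    """
    rng = random.Random(seed)
    words = sentence.split()
    k = min(num_keywords, len(words))       # allow fewer words than requested
    keywords = rng.sample(words, k)         # sample without replacement
    source = " <s> ".join(keywords)         # hypothetical separator token
    return source, sentence

# Example: a four-keyword source paired with the full target sentence.
src, tgt = make_training_pair("The quick brown fox jumps over the lazy dog", seed=0)
```

A seq2seq model such as BART would then be trained to map `src` to `tgt`; varying `num_keywords` at inference time is what lets the model accept fewer or more words than seen in training.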
