Metaphorical Paraphrase Generation: Feeding Metaphorical Language Models with Literal Texts

10/10/2022
by Giorgio Ottolina, et al.

This study presents a new approach to metaphorical paraphrase generation: literal tokens in literal sentences are masked and then unmasked with metaphorical language models. Unlike similar studies, the proposed algorithm focuses not only on verbs but also on nouns and adjectives. Although the transfer rate for the former is the highest (56%), transfer for the latter is also feasible (24%). System-generated metaphors are considered more creative and metaphorical than human-generated ones, while using our transferred metaphors for data augmentation improves the state of the art in metaphorical sentence classification by 3%.
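The masking step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the use of the `[MASK]` placeholder, whitespace tokenization, and the assumption that the literal verbs, nouns, or adjectives to replace have already been identified (e.g. by a POS tagger) are all assumptions for the sake of the example. The unmasking step would then be performed by a language model fine-tuned on metaphorical text.

```python
def mask_literal_tokens(sentence, targets):
    """Replace pre-identified literal tokens (verbs, nouns, adjectives)
    with the MLM mask placeholder.

    sentence: the literal input sentence.
    targets:  a set of lowercase literal tokens to mask; assumed to have
              been selected beforehand (e.g. via POS tagging).
    """
    tokens = sentence.split()
    # Strip trailing punctuation before matching so "grew," still matches "grew".
    return " ".join(
        "[MASK]" if t.strip(".,!?;:").lower() in targets else t
        for t in tokens
    )


# Mask the literal verb; a metaphor-tuned masked LM would then propose
# a metaphorical filler such as "blossomed" or "soared".
masked = mask_literal_tokens("The economy grew quickly last year.", {"grew"})
print(masked)  # The economy [MASK] quickly last year.
```

Running the masked sentence through a masked language model fine-tuned on metaphorical corpora, rather than a generic one, is what steers the filled-in token toward a metaphorical reading.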


Related research

06/09/2020: Unsupervised Paraphrase Generation using Pre-trained Language Models
Large scale Pre-trained Language Models have proven to be very powerful ...

08/23/2018: The Importance of Generation Order in Language Modeling
Neural language models are a critical component of state-of-the-art syst...

04/18/2021: GPT3Mix: Leveraging Large-scale Language Models for Text Augmentation
Large-scale language models such as GPT-3 are excellent few-shot learner...

05/28/2023: Targeted Data Generation: Finding and Fixing Model Weaknesses
Even when aggregate accuracy is high, state-of-the-art NLP models often ...

05/11/2023: Autocorrelations Decay in Texts and Applicability Limits of Language Models
We show that the laws of autocorrelations decay in texts are closely rel...

05/24/2023: Deriving Language Models from Masked Language Models
Masked language models (MLM) do not explicitly define a distribution ove...

06/15/2022: BaIT: Barometer for Information Trustworthiness
This paper presents a new approach to the FNC-1 fake news classification...
