Scientific and Creative Analogies in Pretrained Language Models

11/28/2022
by Tamara Czinczoll et al.

This paper examines the encoding of analogy in large-scale pretrained language models, such as BERT and GPT-2. Existing analogy datasets typically focus on a limited set of analogical relations, with a high similarity of the two domains between which the analogy holds. As a more realistic setup, we introduce the Scientific and Creative Analogy dataset (SCAN), a novel analogy dataset containing systematic mappings of multiple attributes and relational structures across dissimilar domains. Using this dataset, we test the analogical reasoning capabilities of several widely-used pretrained language models (LMs). We find that state-of-the-art LMs achieve low performance on these complex analogy tasks, highlighting the challenges still posed by analogy understanding.
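The abstract describes probing off-the-shelf pretrained LMs such as BERT and GPT-2 for analogical reasoning. As an illustration only (the paper's exact prompting and scoring protocol is not reproduced here), the minimal sketch below shows one common way to probe GPT-2 for analogy completion with the Hugging Face transformers library: each candidate answer is scored by the log-probability the model assigns to the prompt plus that answer. The prompt and candidate words are hypothetical cross-domain examples, not items taken verbatim from SCAN.

```python
# Illustrative sketch, not the authors' code: probing GPT-2 for analogy
# completion by scoring candidate answers with their sequence log-probability.
import torch
from transformers import GPT2LMHeadModel, GPT2TokenizerFast

tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

def sequence_log_prob(text: str) -> float:
    """Total log-probability GPT-2 assigns to `text`."""
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        # With labels == inputs, the returned loss is the mean token NLL.
        out = model(ids, labels=ids)
    return -out.loss.item() * (ids.size(1) - 1)

# Hypothetical cross-domain analogy prompt (not verbatim from SCAN).
prompt = "If the atom is like the solar system, then the electron is like the"
candidates = ["planet", "sun", "moon"]

scores = {c: sequence_log_prob(f"{prompt} {c}.") for c in candidates}
print(max(scores, key=scores.get), scores)
```

Comparing raw sequence log-probabilities is a simplification: candidates of different token lengths may call for length normalization, and masked LMs such as BERT require a different scoring scheme (e.g. pseudo-log-likelihood) rather than left-to-right likelihood.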


research · 04/13/2021
Discourse Probing of Pretrained Language Models
Existing work on probing of pretrained language models (LMs) has predomi...

research · 05/22/2023
Beneath Surface Similarity: Large Language Models Make Reasonable Scientific Analogies after Structure Abduction
Analogical reasoning is essential for human cognition, allowing us to co...

research · 09/16/2021
Efficient Attribute Injection for Pretrained Language Models
Metadata attributes (e.g., user and product IDs from reviews) can be inc...

research · 05/04/2020
The Sensitivity of Language Models and Humans to Winograd Schema Perturbations
Large-scale pretrained language models are the major driving force behin...

research · 07/01/2021
Leveraging Domain Agnostic and Specific Knowledge for Acronym Disambiguation
An obstacle to scientific document understanding is the extensive use of...

research · 10/16/2020
Linguistically-Informed Transformations (LIT): A Method for Automatically Generating Contrast Sets
Although large-scale pretrained language models, such as BERT and RoBERT...
