Analogy Generation by Prompting Large Language Models: A Case Study of InstructGPT

10/09/2022
by Bhavya Bhavya, et al.

We propose a novel application of prompting Pre-trained Language Models (PLMs) to generate analogies, and we study how to design effective prompts for two task settings: generating a source concept analogous to a given target concept (Analogous Concept Generation, or ACG), and generating an explanation of the similarity between a given pair of target and source concepts (Analogous Explanation Generation, or AEG). We found that it is feasible to prompt InstructGPT to generate meaningful analogies, and that the best prompts tend to be precise imperative statements, especially with a low temperature setting. We also systematically analyzed the sensitivity of the InstructGPT model to prompt design, temperature, and injected spelling errors, and found that the model is particularly sensitive to certain variations (e.g., questions vs. imperative statements). Further, we conducted a human evaluation of 1.4k generated analogies and found that generation quality varies substantially with model size. The largest InstructGPT model can achieve human-level performance at generating meaningful analogies for a given target, while there is still room for improvement on the AEG task.
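The two task settings can be sketched as simple prompt templates. The wordings below are illustrative assumptions (the paper experiments with many prompt variants, and the exact templates are not given in the abstract); the note about low temperature reflects the study's finding that precise imperative prompts work best at low temperature.

```python
# Illustrative prompt builders for the two task settings described above.
# The exact phrasings are assumptions, not the paper's actual templates.

def acg_prompt(target: str) -> str:
    # Analogous Concept Generation: ask the model to produce a source
    # concept analogous to the given target concept.
    return f"Generate an analogy for the concept of {target}."

def aeg_prompt(target: str, source: str) -> str:
    # Analogous Explanation Generation: ask the model to explain the
    # similarity between a given target/source pair.
    return f"Explain the similarity between {target} and {source}."

# Such prompts would then be sent to an InstructGPT model via the OpenAI
# API with a low temperature (e.g., 0.0) to favor precise completions.
print(acg_prompt("gravity"))
print(aeg_prompt("gravity", "a magnet pulling iron filings"))
```

Note that imperative statements ("Generate...", "Explain...") rather than questions are used here, matching the prompt style the study found the model to be most effective with.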

Related research

- 09/11/2022: Chain of Explanation: New Prompting Method to Generate Higher Quality Natural Language Explanation for Implicit Hate Speech ("Recent studies have exploited advanced generative language models to gen...")
- 05/09/2023: Towards an Automatic Optimisation Model Generator Assisted with Generative Pre-trained Transformer ("This article presents a framework for generating optimisation models usi...")
- 06/14/2021: Automatic Document Sketching: Generating Drafts from Analogous Texts ("The advent of large pre-trained language models has made it possible to ...")
- 04/11/2022: Explanation Graph Generation via Pre-trained Language Models: An Empirical Study with Contrastive Learning ("Pre-trained sequence-to-sequence language models have led to widespread ...")
- 04/04/2022: Using Pre-Trained Language Models for Producing Counter Narratives Against Hate Speech: a Comparative Study ("In this work, we present an extensive study on the use of pre-trained la...")
- 03/13/2023: Architext: Language-Driven Generative Architecture Design ("Architectural design is a highly complex practice that involves a wide d...")
- 11/16/2021: Generative Pre-Trained Transformer for Design Concept Generation: An Exploration ("Novel concepts are essential for design innovation and can be generated ...")
