Generative Pre-Trained Transformer for Design Concept Generation: An Exploration

11/16/2021
by Qihao Zhu, et al.

Novel concepts are essential for design innovation and can be generated with the aid of data stimuli and computers. However, current generative design algorithms focus on diagrammatic or spatial concepts that are either too abstract to understand or too detailed for early-phase design exploration. This paper explores the use of generative pre-trained transformers (GPT) for natural-language design concept generation. Our experiments apply GPT-2 and GPT-3 to different modes of creative reasoning in design tasks. Both show reasonably good performance for verbal design concept generation.
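The abstract describes applying GPT-2 and GPT-3 to verbal design concept generation. A common way to elicit such behavior from a completion-style language model is few-shot prompting. The sketch below is illustrative only: the template, the `build_concept_prompt` helper, and the example problem/concept pairs are assumptions, not the authors' actual prompts or data.

```python
def build_concept_prompt(problem, examples):
    """Assemble a few-shot prompt for verbal design concept
    generation with a GPT-style completion model.

    `examples` is a list of (problem, concept) pairs used as
    in-context demonstrations; the final line leaves the concept
    blank for the model to complete.
    """
    lines = []
    for prob, concept in examples:
        lines.append(f"Problem: {prob}\nConcept: {concept}\n")
    lines.append(f"Problem: {problem}\nConcept:")
    return "\n".join(lines)

# Hypothetical demonstrations (not taken from the paper).
demos = [
    ("reduce plastic waste in packaging",
     "edible seaweed-based wrappers that dissolve in hot water"),
    ("help commuters carry groceries by bicycle",
     "a foldable pannier that doubles as a shopping basket"),
]

prompt = build_concept_prompt("keep parcels cool during delivery", demos)
print(prompt)
```

The resulting string would be passed as the prompt to a GPT-2 or GPT-3 completion call; the model's continuation after the trailing "Concept:" is read off as the generated design concept.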


Related research

Generative Transformers for Design Concept Generation (11/07/2022)
Generating novel and useful concepts is essential during the early desig...

Biologically Inspired Design Concept Generation Using Generative Pre-Trained Transformers (12/26/2022)
Biological systems in nature have evolved for millions of years to adapt...

Generation of concept-representative symbols (07/28/2017)
The visual representation of concepts or ideas through the use of simple...

Concept Generation in Language Evolution (01/25/2016)
This thesis investigates the generation of new concepts from combination...

AI-Assisted Design Concept Exploration Through Character Space Construction (01/15/2022)
We propose an AI-assisted design concept exploration tool, the "Characte...

Recommending Metamodel Concepts during Modeling Activities with Pre-Trained Language Models (04/04/2021)
The design of conceptually sound metamodels that embody proper semantics...

Analogy Generation by Prompting Large Language Models: A Case Study of InstructGPT (10/09/2022)
We propose a novel application of prompting Pre-trained Language Models ...
