Domain Knowledge Distillation from Large Language Model: An Empirical Study in the Autonomous Driving Domain

07/17/2023
by Yun Tang, et al.

Engineering knowledge-based (or expert) systems requires extensive manual effort and domain knowledge. Because Large Language Models (LLMs) are trained on an enormous amount of cross-domain knowledge, it becomes possible to automate such engineering processes. This paper presents an empirical automation and semi-automation framework for domain knowledge distillation using prompt engineering and the LLM ChatGPT. We assess the framework empirically in the autonomous driving domain and present our key observations. In our implementation, we construct the domain knowledge ontology by "chatting" with ChatGPT. The key finding is that while fully automated domain ontology construction is possible, human supervision and early intervention typically improve efficiency and output quality, as they lessen the effects of response randomness and of the butterfly effect, whereby errors in early responses compound through later expansions. We therefore also develop a web-based distillation assistant that enables supervision and flexible intervention at runtime. We hope our findings and tools inspire future research toward revolutionizing the engineering of knowledge-based systems across application domains.
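To make the "chatting" idea concrete, below is a minimal sketch of what such a distillation loop might look like. It is not the paper's actual implementation: the prompts, the JSON output format, the depth cap, and the SUPERVISED flag are all illustrative assumptions, and it uses the legacy openai<1.0 ChatCompletion API that was current in mid-2023.

```python
# Illustrative distillation loop (not the paper's code): grow a driving-domain
# ontology by repeatedly asking ChatGPT for sub-concepts of each node.
import json

import openai

openai.api_key = "YOUR_API_KEY"  # assumption: legacy openai<1.0 ChatCompletion API

SUPERVISED = False  # set True for a human-in-the-loop mode like the paper recommends


def ask(messages):
    """One chat turn; temperature=0 damps the response randomness noted above."""
    resp = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=messages,
        temperature=0,
    )
    return resp["choices"][0]["message"]["content"]


def expand_concept(concept):
    """Ask for the direct sub-concepts of one ontology node as a JSON array."""
    messages = [
        {"role": "system",
         "content": "You are an autonomous-driving domain expert building an ontology."},
        {"role": "user",
         "content": f'List the direct sub-concepts of "{concept}" in the '
                    f'autonomous driving domain as a JSON array of strings.'},
    ]
    try:
        subs = json.loads(ask(messages))
        return subs if isinstance(subs, list) else []
    except json.JSONDecodeError:
        return []  # malformed reply: exactly where early intervention pays off


ontology = {}
frontier = ["driving scenario"]
for _ in range(2):  # shallow depth cap; the paper's stopping rule may differ
    next_frontier = []
    for concept in frontier:
        subs = expand_concept(concept)
        if SUPERVISED:
            # Early intervention: prune bad branches before they propagate
            # (the "butterfly effect" the abstract mentions).
            subs = [s for s in subs if input(f"keep '{s}'? [y/N] ").lower() == "y"]
        ontology[concept] = subs
        next_frontier.extend(subs)
    frontier = next_frontier

print(json.dumps(ontology, indent=2))
```

Setting the temperature to 0 and gating each expansion behind a human check are two direct ways to address the response randomness and butterfly effect that the abstract highlights.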
