Symbolic Knowledge Distillation: from General Language Models to Commonsense Models

10/14/2021
by Peter West, et al.

The common practice for training commonsense models has gone from-human-to-corpus-to-machine: humans author commonsense knowledge graphs in order to train commonsense models. In this work, we investigate an alternative, from-machine-to-corpus-to-machine: general language models author these commonsense knowledge graphs to train commonsense models. Our study leads to a new framework, Symbolic Knowledge Distillation. As with prior art in Knowledge Distillation (Hinton et al., 2015), our approach uses larger models to teach smaller models. A key difference is that we distill knowledge symbolically, as text, in addition to the neural model. We also distill only one aspect, the commonsense of a general language model teacher, allowing the student to be a different type, a commonsense model. Altogether, we show that careful prompt engineering and a separately trained critic model allow us to selectively distill high-quality causal commonsense from GPT-3, a general language model. Empirical results demonstrate that, for the first time, a human-authored commonsense knowledge graph is surpassed by our automatically distilled variant in all three criteria: quantity, quality, and diversity. In addition, it results in a neural commonsense model that surpasses the teacher model's commonsense capabilities despite its 100x smaller size. We apply this to the ATOMIC resource, and share our new symbolic knowledge graph and commonsense models.
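To make the distillation loop concrete, the sketch below shows one way the pipeline described in the abstract could be wired up: a large teacher language model is prompted few-shot to propose ATOMIC-style inferences, a separately trained critic filters the candidates, and the surviving triples form the distilled symbolic corpus used to fine-tune a much smaller student. This is a minimal illustration under stated assumptions; all names here (Triple, generate_candidates, filter_with_critic, the prompt text) are hypothetical placeholders, not the authors' released code.

```python
# Hypothetical sketch of symbolic knowledge distillation: teacher LM generates
# candidate commonsense triples, a trained critic filters them, and the kept
# triples become the symbolic knowledge graph for training a student model.

from dataclasses import dataclass
from typing import Callable, Iterable, List

@dataclass
class Triple:
    event: str       # e.g. "PersonX pays for PersonY's lunch"
    relation: str    # e.g. "xEffect" (an ATOMIC causal relation)
    inference: str   # e.g. "PersonX gets thanked"

# Illustrative few-shot prompt (assumption, not the paper's exact prompt).
FEW_SHOT_PROMPT = """\
Event: PersonX finds a new job. Effect on PersonX: PersonX feels relieved.
Event: PersonX wins the race. Effect on PersonX: PersonX celebrates.
Event: {event}. Effect on PersonX:"""

def generate_candidates(teacher_lm: Callable[[str], List[str]],
                        events: Iterable[str]) -> List[Triple]:
    """Prompt the teacher LM few-shot to propose inferences for each event."""
    candidates = []
    for event in events:
        for completion in teacher_lm(FEW_SHOT_PROMPT.format(event=event)):
            candidates.append(Triple(event, "xEffect", completion.strip()))
    return candidates

def filter_with_critic(critic: Callable[[Triple], float],
                       candidates: List[Triple],
                       threshold: float = 0.5) -> List[Triple]:
    """Keep only triples the critic scores as likely-valid commonsense.

    The critic stands in for the separately trained classifier the paper
    describes; here it is any callable returning P(valid | triple).
    """
    return [t for t in candidates if critic(t) >= threshold]

# The filtered corpus plays the role of the distilled symbolic knowledge
# graph; a ~100x smaller student model would then be fine-tuned on it with
# a standard language-modeling objective.
```

The key design point the abstract emphasizes is the critic: generation alone yields noisy triples, so the quality of the distilled graph comes from selectively keeping only high-scoring candidates before training the student.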

Related research

COMET: Commonsense Transformers for Automatic Knowledge Graph Construction (06/12/2019)
We present the first comprehensive study on automatic knowledge base con...

COMET-ATOMIC 2020: On Symbolic and Neural Commonsense Knowledge Graphs (10/12/2020)
Recent years have brought about a renewed interest in commonsense repres...

Exploiting Structural and Semantic Context for Commonsense Knowledge Base Completion (10/07/2019)
Automatic KB completion for commonsense knowledge graphs (e.g., ATOMIC a...

Identify, Align, and Integrate: Matching Knowledge Graphs to Commonsense Reasoning Tasks (04/20/2021)
Integrating external knowledge into commonsense reasoning tasks has show...

Neuro-Symbolic Causal Language Planning with Commonsense Prompting (06/06/2022)
Language planning aims to implement complex high-level goals by decompos...

Adversarial Transformer Language Models for Contextual Commonsense Inference (02/10/2023)
Contextualized or discourse aware commonsense inference is the task of g...

Snowman: A Million-scale Chinese Commonsense Knowledge Graph Distilled from Foundation Model (06/17/2023)
Constructing commonsense knowledge graphs (CKGs) has attracted wide rese...
