Knowledge-Grounded Dialogue Generation with a Unified Knowledge Representation

12/15/2021
by Yu Li, et al.

Knowledge-grounded dialogue systems are challenging to build due to the lack of training data and the heterogeneity of knowledge sources. Existing systems perform poorly on unseen topics because the training data covers only a limited set of topics. In addition, heterogeneous knowledge sources make it difficult for systems to generalize to other tasks, because knowledge in different representations requires different knowledge encoders. To address these challenges, we present PLUG, a language model that homogenizes different knowledge sources into a unified knowledge representation for knowledge-grounded dialogue generation tasks. PLUG is pre-trained on a dialogue generation task conditioned on this unified essential knowledge representation, and it can generalize to different downstream knowledge-grounded dialogue generation tasks with only a few training examples. Empirical evaluation on two benchmarks shows that our model generalizes well across different knowledge-grounded tasks: it achieves performance comparable to state-of-the-art methods in a fully supervised setting and significantly outperforms other methods in zero-shot and few-shot settings.
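To make the idea concrete, the sketch below shows one plausible way to condition a seq2seq language model on a unified, linearized knowledge string, in the spirit of what the abstract describes. This is not the authors' released code; the base model, prompt format, separator tokens, and function names are illustrative assumptions.

```python
# Minimal sketch (assumptions: T5 backbone, "knowledge:"/"dialogue:" prompt format)
# of generating a response conditioned on a unified knowledge representation.
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

def linearize_knowledge(snippets):
    """Flatten heterogeneous knowledge (passages, triples rendered as text, etc.)
    into a single string -- a stand-in for the unified knowledge representation."""
    return " | ".join(snippets)

def generate_response(dialogue_history, knowledge_snippets):
    # Prepend the linearized knowledge to the dialogue context; the exact
    # input layout here is an assumption, not the paper's specification.
    source = (
        "knowledge: " + linearize_knowledge(knowledge_snippets)
        + " dialogue: " + " </s> ".join(dialogue_history)
    )
    inputs = tokenizer(source, return_tensors="pt", truncation=True, max_length=512)
    output_ids = model.generate(**inputs, max_new_tokens=64, num_beams=4)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

print(generate_response(
    ["Have you seen any good movies lately?"],
    ["Dune (2021) is a science-fiction film directed by Denis Villeneuve."],
))
```

Because all knowledge sources are rendered into the same textual form before generation, a single encoder handles documents, tables, or triples alike, which is what allows few-shot transfer across knowledge-grounded tasks.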

Related research

08/29/2020 · Zero-Resource Knowledge-Grounded Dialogue Generation
While neural conversation models have shown great potential towards gen...

06/06/2023 · TwistList: Resources and Baselines for Tongue Twister Generation
Previous work in phonetically-grounded language generation has mainly fo...

04/22/2022 · FaithDial: A Faithful Benchmark for Information-Seeking Dialogue
The goal of information-seeking dialogue is to respond to seeker queries...

02/24/2020 · Low-Resource Knowledge-Grounded Dialogue Generation
Responding with knowledge has been recognized as an important capability...

03/01/2023 · Grounded Decoding: Guiding Text Generation with Grounded Models for Robot Control
Recent progress in large language models (LLMs) has demonstrated the abi...

09/16/2021 · Transferable Persona-Grounded Dialogues via Grounded Minimal Edits
Grounded dialogue models generate responses that are grounded on certain...

09/09/2021 · A Three-Stage Learning Framework for Low-Resource Knowledge-Grounded Dialogue Generation
Neural conversation models have shown great potential towards generatin...
