Empathetic Dialogue Generation with Pre-trained RoBERTa-GPT2 and External Knowledge

by   Ye Liu, et al.

One challenge for dialogue agents is to recognize the feelings of the conversation partner and respond accordingly. In this work, RoBERTa-GPT2 is proposed for empathetic dialogue generation, where the pre-trained auto-encoding RoBERTa is utilised as the encoder and the pre-trained auto-regressive GPT-2 as the decoder. With the combination of the pre-trained RoBERTa and GPT-2, our model achieves a new state-of-the-art emotion accuracy. To enable the empathetic ability of the RoBERTa-GPT2 model, we propose a commonsense knowledge and emotional concepts extractor, in which the commonsense and emotional concepts of the dialogue context are extracted for the GPT-2 decoder. The experimental results demonstrate that empathetic dialogue generation benefits from both the pre-trained encoder-decoder architecture and the external knowledge.
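The core coupling the abstract describes — an auto-encoding encoder feeding an auto-regressive decoder through cross-attention — can be sketched with Hugging Face's generic `EncoderDecoderModel`. This is not the authors' code: it uses tiny, randomly initialised RoBERTa-style and GPT-2-style configs so it runs without downloading checkpoints, and it omits the paper's emotion classifier and external-knowledge concepts.

```python
# Hedged sketch of a RoBERTa-encoder / GPT-2-decoder pairing, assuming the
# standard transformers EncoderDecoderModel; configs are toy-sized stand-ins
# for roberta-base and gpt2 so the example runs without network access.
import torch
from transformers import (EncoderDecoderConfig, EncoderDecoderModel,
                          GPT2Config, RobertaConfig)

enc_cfg = RobertaConfig(vocab_size=1000, hidden_size=64, num_hidden_layers=2,
                        num_attention_heads=2, intermediate_size=128)
dec_cfg = GPT2Config(vocab_size=1000, n_embd=64, n_layer=2, n_head=2)

# from_encoder_decoder_configs marks the decoder as a decoder and adds
# cross-attention layers so it can attend to the encoder's hidden states.
cfg = EncoderDecoderConfig.from_encoder_decoder_configs(enc_cfg, dec_cfg)
model = EncoderDecoderModel(config=cfg)

# The dialogue context is encoded once; the response is modelled
# auto-regressively by the decoder conditioned on those states.
context = torch.randint(0, 1000, (2, 7))    # (batch, context length)
response = torch.randint(0, 1000, (2, 5))   # (batch, response length)
out = model(input_ids=context, decoder_input_ids=response)
print(tuple(out.logits.shape))              # (batch, response length, vocab)
```

With real checkpoints one would instead call `EncoderDecoderModel.from_encoder_decoder_pretrained("roberta-base", "gpt2")`; the paper additionally supplies extracted commonsense and emotional concepts to the GPT-2 decoder, a step this sketch leaves out.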
