
EmpTransfo: A Multi-head Transformer Architecture for Creating Empathetic Dialog Systems

03/05/2020
by Rohola Zandie, et al.
University of Denver

Understanding emotions and responding accordingly is one of the biggest challenges of dialog systems. This paper presents EmpTransfo, a multi-head Transformer architecture for creating an empathetic dialog system. EmpTransfo utilizes state-of-the-art pre-trained models (e.g., OpenAI-GPT) for language generation, though models with different sizes can be used. We show that utilizing the history of emotions and other metadata can improve the quality of generated conversations by the dialog system. Our experimental results using a challenging language corpus show that the proposed approach outperforms other models in terms of Hit@1 and PPL (Perplexity).
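The core idea of a multi-head architecture — one shared representation feeding a language-modeling head plus an auxiliary emotion head, with their losses combined — and the PPL metric reported above can be sketched in plain Python. The loss weights, head names, and toy logits here are illustrative assumptions, not values from the paper:

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of raw scores.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def cross_entropy(logits, target_idx):
    # Negative log-likelihood of the target class under softmax.
    return -math.log(softmax(logits)[target_idx])

def multi_head_loss(lm_logits, lm_targets, emo_logits, emo_target,
                    w_lm=1.0, w_emo=0.5):
    # Language-modeling head: average token-level cross-entropy.
    lm_loss = sum(cross_entropy(l, t)
                  for l, t in zip(lm_logits, lm_targets)) / len(lm_targets)
    # Emotion-classification head over the same shared hidden state.
    emo_loss = cross_entropy(emo_logits, emo_target)
    # Weighted sum of the per-head losses (weights are assumptions).
    return w_lm * lm_loss + w_emo * emo_loss

def perplexity(lm_logits, lm_targets):
    # PPL = exp(mean negative log-likelihood) -- the metric cited above.
    nll = sum(cross_entropy(l, t)
              for l, t in zip(lm_logits, lm_targets)) / len(lm_targets)
    return math.exp(nll)
```

As a sanity check, uniform logits over a vocabulary of size V give a perplexity of exactly V, since each token then has probability 1/V.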

05/11/2020

SOLOIST: Few-shot Task-Oriented Dialog with A Single Pre-trained Auto-regressive Model

This paper presents a new method SOLOIST, which uses transfer learning t...
04/24/2020

A Tailored Pre-Training Model for Task-Oriented Dialog Generation

The recent success of large pre-trained language models such as BERT and...
09/14/2022

SPACE-3: Unified Dialog Model Pre-training for Task-Oriented Dialog Understanding and Generation

Recently, pre-training methods have shown remarkable success in task-ori...
10/10/2022

Transformer-based Localization from Embodied Dialog with Large-scale Pre-training

We address the challenging task of Localization via Embodied Dialog (LED...
06/26/2020

Dialog as a Vehicle for Lifelong Learning

Dialog systems research has primarily been focused around two main types...
06/04/2019

Do Neural Dialog Systems Use the Conversation History Effectively? An Empirical Study

Neural generative models have become increasingly popular when buil...
04/06/2022

Quick Starting Dialog Systems with Paraphrase Generation

Acquiring training data to improve the robustness of dialog systems can ...