
EmpTransfo: A Multi-head Transformer Architecture for Creating Empathetic Dialog Systems

by Rohola Zandie et al.
University of Denver

Understanding emotions and responding accordingly is one of the biggest challenges for dialog systems. This paper presents EmpTransfo, a multi-head Transformer architecture for building an empathetic dialog system. EmpTransfo uses state-of-the-art pre-trained models (e.g., OpenAI GPT) for language generation, though models of different sizes can be substituted. We show that exploiting the history of emotions and other metadata improves the quality of the conversations generated by the dialog system. Experimental results on a challenging language corpus show that the proposed approach outperforms other models in terms of Hit@1 and perplexity (PPL).
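The abstract describes conditioning generation on the history of emotions alongside the dialog history. One common way to do this in GPT-style models is to serialize utterances, speaker markers, and emotion tags into a single token sequence. The sketch below is a minimal, hypothetical illustration of that input construction; the special-token names (`<bos>`, `<speaker1>`, emotion tags) are assumptions for illustration, not the paper's exact vocabulary.

```python
# Hypothetical sketch: flattening dialog history plus per-utterance emotion
# labels into one token sequence for a GPT-style generator. Special-token
# names are illustrative, not taken from the paper.

def build_input(history, emotions, reply):
    """Serialize (utterance, emotion) pairs and a candidate reply.

    history  -- list of utterance token lists, alternating speakers
    emotions -- one emotion label per history utterance
    reply    -- candidate reply token list (speaker 2)
    """
    tokens = ["<bos>"]
    for i, (utt, emo) in enumerate(zip(history, emotions)):
        speaker = "<speaker1>" if i % 2 == 0 else "<speaker2>"
        tokens.append(speaker)
        tokens.append(f"<{emo}>")  # emotion tag conditions the model
        tokens.extend(utt)
    tokens.append("<speaker2>")
    tokens.extend(reply)
    tokens.append("<eos>")
    return tokens

seq = build_input(
    history=[["how", "are", "you", "?"], ["i", "feel", "great", "!"]],
    emotions=["neutral", "joy"],
    reply=["glad", "to", "hear", "it"],
)
```

A real implementation would map these tokens to vocabulary IDs and add segment embeddings; the "multi-head" part of the architecture would then attach, besides the language-modeling head, auxiliary heads (e.g., next-emotion prediction) on top of the shared Transformer.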
