Meta-Context Transformers for Domain-Specific Response Generation

10/12/2020
by Debanjana Kar, et al.

Despite the tremendous success of neural dialogue models in recent years, they suffer from a lack of relevance, diversity, and, at times, coherence in generated responses. Lately, transformer-based models such as GPT-2 have revolutionized the landscape of dialogue generation by capturing long-range structure through language modeling. Although these models exhibit excellent language coherence, they often lack relevance and domain-specific terms when used for domain-specific response generation. In this paper, we present DSRNet (Domain Specific Response Network), a transformer-based model for dialogue response generation that reinforces domain-specific attributes. In particular, we extract meta attributes from the context and infuse them with the context utterances for better attention over domain-specific key terms and improved relevance. We study the use of DSRNet for domain-specific response generation in a multi-turn, multi-interlocutor environment. In our experiments, we evaluate DSRNet on the Ubuntu dialogue datasets, which consist mainly of technical dialogues about IT-domain issue resolution, and on the CamRest676 dataset, which contains restaurant-domain conversations. Trained with a maximum likelihood objective, our model shows significant improvement over the state of the art for multi-turn dialogue systems, supported by better BLEU and semantic similarity (BertScore) scores. We also observe that the responses produced by our model are more relevant, as the domain-specific key attributes they contain overlap better with the attributes of the context. Our analysis shows that the performance improvement is mostly due to the infusion of key terms along with the dialogues, which results in better attention over domain-relevant terms. Other contributing factors include the joint modeling of dialogue context with domain-specific meta attributes and topics.
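The abstract describes infusing extracted meta attributes with the context utterances and training the model with a maximum likelihood objective. The sketch below is a minimal illustration of that idea, not the authors' implementation: it assumes the meta attributes are prepended to the dialogue context as plain text with hypothetical separator markers (`<META>`, `<CTX>`, `<RESP>`, `<EOU>`) and fine-tunes an off-the-shelf GPT-2 language model with the standard causal LM loss.

```python
# Minimal sketch (not the authors' code): feed meta attributes together with
# the dialogue context into a GPT-2 LM and train with the maximum likelihood
# (causal language modeling) objective mentioned in the abstract.
# The separator markers below are hypothetical; in practice they would be
# registered via tokenizer.add_special_tokens and model.resize_token_embeddings.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

def build_input(meta_attributes, context_utterances, response):
    """Concatenate meta attributes, dialogue context, and target response
    into one token sequence (hypothetical formatting)."""
    meta = " ; ".join(meta_attributes)
    context = " <EOU> ".join(context_utterances)  # <EOU>: end of utterance
    text = f"<META> {meta} <CTX> {context} <RESP> {response}"
    return tokenizer(text, return_tensors="pt", truncation=True, max_length=512)

# Toy IT-support style example with extracted domain key terms.
batch = build_input(
    meta_attributes=["apt-get", "unmet dependencies"],
    context_utterances=[
        "my apt-get upgrade fails with unmet dependencies",
        "which package does it complain about?",
    ],
    response="try running sudo apt-get -f install first",
)

# Maximum likelihood training step: the LM loss is cross-entropy over the
# shifted token sequence, so the labels are simply the input ids.
outputs = model(input_ids=batch["input_ids"], labels=batch["input_ids"])
outputs.loss.backward()
```

In this formulation the meta attributes sit at the front of the sequence, so the self-attention layers can attend to them at every generation step, which is the intuition behind the reported gains on domain-relevant terms; the paper's exact input layout and attribute extraction pipeline may differ.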


