DialogueBERT: A Self-Supervised Learning based Dialogue Pre-training Encoder

09/22/2021
by   Zhenyu Zhang, et al.

With the rapid development of artificial intelligence, conversational bots have become prevalent on mainstream e-commerce platforms, where they provide timely and convenient customer service. To satisfy users, a conversational bot needs to understand the user's intention, detect the user's emotion, and extract key entities from conversational utterances. However, understanding dialogues is a very challenging task: unlike common language understanding, utterances in a dialogue alternate between different roles and are usually organized in hierarchical structures. To facilitate dialogue understanding, in this paper we propose a novel contextual dialogue encoder, DialogueBERT, based on the popular pre-trained language model BERT. Five self-supervised pre-training tasks are devised to learn the particularities of dialogue utterances, and four input embeddings, namely turn embedding, role embedding, token embedding, and position embedding, are integrated to capture the relationships between utterances. DialogueBERT was pre-trained on 70 million real-world dialogues and then fine-tuned on three different downstream dialogue understanding tasks. Experimental results show that DialogueBERT achieves promising results (88.63 for emotion recognition and up to 97.04 across the downstream tasks) and outperforms several strong baselines by a large margin.
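The four-way input embedding described above can be sketched in a few lines. The following is a minimal, hypothetical illustration (the paper does not publish this code): each token's representation is the sum of a token, position, turn, and role embedding, mirroring how BERT sums its token, segment, and position embeddings. Table sizes and token ids are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
HIDDEN = 768  # BERT-base hidden size

# Illustrative lookup tables (random values; only the shapes matter here).
token_table = rng.normal(size=(30522, HIDDEN))  # vocabulary embeddings
pos_table   = rng.normal(size=(512, HIDDEN))    # position embeddings
turn_table  = rng.normal(size=(32, HIDDEN))     # which utterance in the dialogue
role_table  = rng.normal(size=(2, HIDDEN))      # speaker role, e.g. 0 = user, 1 = bot

def dialogue_input_embedding(token_ids, turn_ids, role_ids):
    """Sum the four embeddings per token to form the encoder input."""
    positions = np.arange(len(token_ids))
    return (token_table[token_ids] + pos_table[positions]
            + turn_table[turn_ids] + role_table[role_ids])

# A two-turn dialogue flattened into one token sequence:
tokens = [101, 2023, 2003, 102, 2748, 102]  # toy token ids
turns  = [0, 0, 0, 0, 1, 1]                 # utterance index per token
roles  = [0, 0, 0, 0, 1, 1]                 # speaker role per token
x = dialogue_input_embedding(tokens, turns, roles)
print(x.shape)  # (6, 768)
```

The turn and role tables are what distinguish this input layer from vanilla BERT: they let the encoder see utterance boundaries and speaker alternation directly, rather than inferring them from separator tokens alone.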


research

05/23/2021
Structural Pre-training for Dialogue Comprehension
Pre-trained language models (PrLMs) have demonstrated superior performan...

04/14/2021
K-PLUG: Knowledge-injected Pre-trained Language Model for Natural Language Understanding and Generation in E-Commerce
Existing pre-trained language models (PLMs) have demonstrated the effect...

06/04/2021
Self-supervised Dialogue Learning for Spoken Conversational Question Answering
In spoken conversational question answering (SCQA), the answer to the co...

09/01/2022
Enhancing Semantic Understanding with Self-supervised Methods for Abstractive Dialogue Summarization
Contextualized word embeddings can lead to state-of-the-art performances...

08/31/2022
Unified Knowledge Prompt Pre-training for Customer Service Dialogues
Dialogue bots have been widely applied in customer service scenarios to ...

10/30/2020
Improving Dialogue Breakdown Detection with Semi-Supervised Learning
Building user trust in dialogue agents requires smooth and consistent di...

03/02/2023
Matching-based Term Semantics Pre-training for Spoken Patient Query Understanding
Medical Slot Filling (MSF) task aims to convert medical queries into str...
