Semantic-based Pre-training for Dialogue Understanding

09/19/2022
by Xuefeng Bai et al.

Pre-trained language models have made great progress on dialogue tasks. However, these models are typically trained on surface dialogue text and have therefore been shown to be weak at understanding the main semantic meaning of a dialogue context. We investigate Abstract Meaning Representation (AMR) as explicit semantic knowledge for pre-training models to capture the core semantic information in dialogues. In particular, we propose a semantic-based pre-training framework that extends the standard pre-training framework (Devlin et al., 2019) with three tasks for learning 1) core semantic units, 2) semantic relations, and 3) the overall semantic representation according to AMR graphs. Experiments on both chit-chat and task-oriented dialogue understanding show the superiority of our model. To our knowledge, we are the first to leverage a deep semantic representation for dialogue pre-training.
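The abstract does not spell out how the three AMR-derived objectives are implemented. As a rough illustration only, the PyTorch sketch below shows one plausible way to combine a standard masked-LM loss with three auxiliary losses of the kinds named above. Every component here (the tiny encoder, the head designs, the synthetic labels, and the equal loss weights) is a hypothetical assumption for illustration, not the authors' implementation.

```python
# Illustrative sketch: masked-LM pre-training plus three AMR-derived
# auxiliary objectives. NOT the paper's code; all design choices are assumed.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, HIDDEN, N_REL = 1000, 128, 16  # toy sizes for the sketch

class SemanticPretrainModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, HIDDEN)
        layer = nn.TransformerEncoderLayer(HIDDEN, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.mlm_head = nn.Linear(HIDDEN, VOCAB)            # standard MLM head
        self.unit_head = nn.Linear(HIDDEN, 2)               # does a token align to an AMR node?
        self.rel_head = nn.Bilinear(HIDDEN, HIDDEN, N_REL)  # AMR edge label between two units
        self.graph_proj = nn.Linear(HIDDEN, HIDDEN)         # pooled dialogue-level representation

    def forward(self, ids):
        return self.encoder(self.embed(ids))

model = SemanticPretrainModel()
ids = torch.randint(0, VOCAB, (8, 32))   # synthetic batch of dialogue token ids
h = model(ids)                           # (batch, seq, hidden)

# 0) Standard masked-LM objective (Devlin et al., 2019); labels are synthetic here.
mlm_labels = torch.randint(0, VOCAB, (8, 32))
loss_mlm = F.cross_entropy(model.mlm_head(h).transpose(1, 2), mlm_labels)

# 1) Core semantic units: tag tokens that align to AMR concept nodes.
unit_labels = torch.randint(0, 2, (8, 32))
loss_unit = F.cross_entropy(model.unit_head(h).transpose(1, 2), unit_labels)

# 2) Semantic relations: classify the AMR relation between two aligned tokens
#    (positions 3 and 7 stand in for a real token-to-node alignment).
rel_labels = torch.randint(0, N_REL, (8,))
loss_rel = F.cross_entropy(model.rel_head(h[:, 3], h[:, 7]), rel_labels)

# 3) Overall semantic representation: pull the pooled dialogue vector toward
#    a (here random, normally precomputed) embedding of its AMR graph.
graph_target = torch.randn(8, HIDDEN)
loss_graph = 1 - F.cosine_similarity(model.graph_proj(h.mean(1)), graph_target).mean()

# Equal weighting of the four losses, purely for illustration.
loss = loss_mlm + loss_unit + loss_rel + loss_graph
loss.backward()
```

In practice the unit and relation labels would come from aligning an AMR parse of each dialogue turn to its tokens, and the loss weights would be tuned; the sketch only shows how such objectives can share one encoder with the MLM task.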


Related research

03/15/2022 · Graph Pre-training for AMR Parsing and Generation
Abstract meaning representation (AMR) highlights the core semantic infor...

05/21/2021 · Semantic Representation for Dialogue Modeling
Although neural models have achieved competitive results in dialogue sys...

05/18/2023 · Causal Document-Grounded Dialogue Pre-training
The goal of document-grounded dialogue (DocGD) is to generate a response...

07/08/2022 · DSTEA: Dialogue State Tracking with Entity Adaptive Pre-training
Dialogue state tracking (DST) is a core sub-module of a dialogue system,...

10/17/2019 · PLATO: Pre-trained Dialogue Generation Model with Discrete Latent Variable
Pre-training models have been proved effective for a wide range of natur...

01/31/2023 · Friend-training: Learning from Models of Different but Related Tasks
Current self-training methods such as standard self-training, co-trainin...

10/26/2020 · Probing Task-Oriented Dialogue Representation from Language Models
This paper investigates pre-trained language models to find out which mo...
