Multi-Domain Dialogue State Tracking – A Purely Transformer-Based Generative Approach

10/24/2020
by Yan Zeng, et al.

We investigate the problem of multi-domain Dialogue State Tracking (DST) with an open vocabulary. Existing approaches exploit a BERT encoder and a copy-based RNN decoder, where the encoder first predicts the state operation and the decoder then generates new slot values. However, in this stacked encoder-decoder structure, the operation prediction objective only affects the BERT encoder, while the value generation objective mainly affects the RNN decoder. In this paper, we propose a purely Transformer-based framework that uses BERT as both encoder and decoder. In so doing, the operation prediction objective and the value generation objective can jointly optimize our model for DST. At the decoding step, we re-use the hidden states of the encoder in the self-attention mechanism of the corresponding decoder layer to construct a flat model structure for effective parameter updating. Experimental results show that our approach substantially outperforms the existing state-of-the-art framework, and it also achieves performance highly competitive with the best ontology-based approaches.
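The key architectural idea above — re-using the encoder's hidden states inside the decoder's self-attention at the corresponding layer — can be illustrated with a minimal sketch. This is not the authors' implementation; it is a toy single-head attention in numpy where the function name `flat_self_attention` and the weight matrices are illustrative assumptions, showing how encoder states can serve as extra keys/values for the decoder in one "flat" attention call:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax over the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def flat_self_attention(dec_states, enc_states, Wq, Wk, Wv):
    """Toy decoder self-attention that prepends the encoder's hidden
    states (from the corresponding layer) to the decoder's own states,
    so a single attention call spans both -- a sketch of the 'flat'
    structure described in the abstract, not the paper's exact model."""
    # keys/values are computed over encoder states + decoder states
    kv = np.concatenate([enc_states, dec_states], axis=0)
    Q = dec_states @ Wq                      # (T_dec, d)
    K = kv @ Wk                              # (T_enc + T_dec, d)
    V = kv @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[-1])  # scaled dot-product
    return softmax(scores) @ V               # (T_dec, d)

# toy example with random states and weights
rng = np.random.default_rng(0)
d = 8
enc = rng.standard_normal((5, d))   # 5 encoder hidden states
dec = rng.standard_normal((3, d))   # 3 decoder states so far
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
out = flat_self_attention(dec, enc, Wq, Wk, Wv)
print(out.shape)  # (3, 8): one updated vector per decoder position
```

Because the encoder states enter the decoder layer directly (rather than only through a stacked cross-attention on top of a separate decoder), gradients from the value generation objective flow straight into the shared BERT parameters, which is the joint-optimization effect the abstract describes.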

