Guiding Attention in Sequence-to-Sequence Models for Dialogue Act Prediction

by Pierre Colombo, et al.

The task of predicting dialog acts (DA) from conversational dialog is a key component in the development of conversational agents. Accurately predicting DAs requires precise modeling of both the conversation and the global tag dependencies. We leverage seq2seq approaches, widely adopted in Neural Machine Translation (NMT), to improve the modeling of tag sequentiality. Seq2seq models are known to learn complex global dependencies, whereas currently proposed approaches using linear conditional random fields (CRF) only model local tag dependencies. In this work, we introduce a seq2seq model tailored for DA classification, using a hierarchical encoder, a novel guided attention mechanism, and beam search applied to both training and inference. Compared to the state of the art, our model requires no handcrafted features and is trained end-to-end. Furthermore, the proposed approach achieves an unmatched accuracy score of 85% on MRDA.
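To illustrate the core idea, here is a minimal sketch of a guided attention computation. It assumes (this is an illustrative reading, not the paper's exact formulation) that "guiding" means biasing the decoder's attention toward the diagonal, since DA tagging is monotonic: the tag emitted at step t labels utterance t. The `sigma` parameter and the Gaussian log-space prior are assumptions chosen for the sketch.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def guided_attention(queries, keys, sigma=0.1):
    """Scaled dot-product attention with a Gaussian diagonal prior.

    queries: (T_dec, d) decoder states, one per tag to emit.
    keys:    (T_enc, d) encoder states, one per utterance.
    The prior (an assumption for this sketch) pushes decoder step t
    to attend near encoder position t, reflecting the monotonic
    alignment between utterances and their dialog-act tags.
    """
    T_dec, d = queries.shape
    T_enc, _ = keys.shape
    scores = queries @ keys.T / np.sqrt(d)            # (T_dec, T_enc)
    t = np.arange(T_dec)[:, None] / max(T_dec - 1, 1)  # normalized decoder pos
    s = np.arange(T_enc)[None, :] / max(T_enc - 1, 1)  # normalized encoder pos
    prior = -((t - s) ** 2) / (2 * sigma ** 2)         # log-space Gaussian bias
    return softmax(scores + prior, axis=-1)

# Demo: with uninformative (zero) states, attention follows the prior,
# so each decoder step attends most strongly to its own position.
w = guided_attention(np.zeros((4, 8)), np.zeros((4, 8)))
```

In practice the guidance could instead be applied as an auxiliary loss on the attention matrix rather than a hard prior at inference; the sketch above only shows the biasing mechanism itself.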






