Multi-Granularity Representations of Dialog

08/26/2019
by Shikib Mehri, et al.

Neural models of dialog rely on generalized latent representations of language. This paper introduces a novel training procedure that explicitly learns multiple representations of language at several levels of granularity. The multi-granularity training algorithm modifies the mechanism by which negative candidate responses are sampled in order to control the granularity of the learned latent representations. Strong performance gains are observed on the next utterance retrieval task using both the MultiWOZ dataset and the Ubuntu dialog corpus. Analysis demonstrates that multiple granularities of representation are indeed being learned, and that multi-granularity training facilitates better transfer to downstream tasks.
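The abstract's core idea is that the granularity of a learned representation is controlled by where negative candidate responses are sampled from. Below is a minimal, hypothetical Python sketch of that idea, assuming (a common setup in retrieval training, not confirmed by this page) that coarse-grained negatives are drawn from other dialogs while fine-grained negatives are drawn from within the same dialog; the function `sample_negatives` and all names are illustrative, not the authors' implementation.

```python
import random

def sample_negatives(dialogs, dialog_idx, turn_idx, granularity, k=5):
    """Sample k negative candidate responses for next utterance retrieval.

    granularity="coarse": negatives come from other dialogs, so the model
    only needs topic-level features to reject them.
    granularity="fine": negatives come from the same dialog, so the model
    must learn turn-level distinctions to reject them.
    """
    if granularity == "coarse":
        pool = [utt
                for i, d in enumerate(dialogs) if i != dialog_idx
                for utt in d]
    else:  # "fine"
        pool = [utt
                for j, utt in enumerate(dialogs[dialog_idx])
                if j != turn_idx + 1]  # exclude the true next utterance
    return random.sample(pool, min(k, len(pool)))

# Toy usage: the true next utterance is at turn_idx + 1.
dialogs = [
    ["hi, I need a hotel", "sure, which area?", "the city centre", "found 3 options"],
    ["book me a train", "where to?", "to cambridge", "what time?"],
]
print("coarse:", sample_negatives(dialogs, 0, 0, "coarse", k=2))
print("fine:  ", sample_negatives(dialogs, 0, 0, "fine", k=2))
```

Under this reading, negatives from unrelated dialogs can be rejected on topic alone, while same-dialog negatives share topic and vocabulary with the true response and so force finer-grained, turn-level features; training against both pools would then yield representations at multiple granularities.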


