Imagination is All You Need! Curved Contrastive Learning for Abstract Sequence Modeling Utilized on Long Short-Term Dialogue Planning

11/14/2022
by Justus-Jonas Erker, et al.

Motivated by the entailment property of multi-turn dialogues, which we capture through contrastively learned sentence embeddings, we introduce a novel technique, Curved Contrastive Learning (CCL), for generating semantically meaningful, conversational-graph-curved utterance embeddings that can be compared using cosine similarity. The resulting bi-encoder models can guide transformers, as a response ranking model, towards a goal in a zero-shot fashion by projecting the goal utterance and the corresponding reply candidates into a latent space. Here, the cosine similarity indicates the distance/reachability of a candidate utterance towards the corresponding goal, which we define as curved space. Furthermore, we explore how these forward-entailing language representations can be used to assess the likelihood of sequences through their entailment strength, i.e., the cosine similarity of their individual members (encoded separately), as an emergent property of the curved space. This allows us to imagine the likelihood of future patterns in dialogues, specifically by ordering/identifying future goal utterances that are multiple turns away, given a dialogue context. As part of our analysis, we investigate characteristics that make conversations (un)plannable and find strong evidence of planning capability over multiple turns (in 61.56% of cases over 3 turns) in conversations from the DailyDialog dataset. Finally, we show how the curved property can be exploited to rank one million utterance-context pairs over 7 million times faster than DialogRPT in terms of GPU computation time, while being on average 2.8% better in quality for sequences longer than 2 turns.
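As a loose illustration of the ranking idea described in the abstract, the sketch below scores reply candidates against a goal utterance with an off-the-shelf bi-encoder and cosine similarity, and scores a candidate dialogue sequence by the mean pairwise similarity of its separately encoded utterances. The all-MiniLM-L6-v2 checkpoint, the example utterances, and the sequence_score helper are placeholders chosen for illustration only; they are not the authors' released CCL model or code.

```python
# Hypothetical sketch of CCL-style goal-directed ranking with a bi-encoder.
from sentence_transformers import SentenceTransformer, util

# Placeholder checkpoint; a trained CCL bi-encoder would be substituted here.
model = SentenceTransformer("all-MiniLM-L6-v2")

goal = "Would you like to grab dinner together tonight?"
candidates = [
    "I'm starving, I haven't eaten all day.",
    "The weather report says it will rain tomorrow.",
    "I just finished a big project at work.",
]

# Encode goal and candidates separately (bi-encoder), then score with cosine
# similarity: a higher score means the candidate lies "closer" to the goal in
# the latent space, i.e. is more likely to lead the dialogue toward it.
goal_emb = model.encode(goal, convert_to_tensor=True)
cand_embs = model.encode(candidates, convert_to_tensor=True)
scores = util.cos_sim(cand_embs, goal_emb).squeeze(-1)

for cand, score in sorted(zip(candidates, scores.tolist()), key=lambda x: -x[1]):
    print(f"{score:.3f}  {cand}")

# Simplified reading of the sequence-likelihood idea: score a sequence by the
# entailment strength (cosine similarity) between consecutive utterances,
# each encoded on its own, averaged over the sequence.
def sequence_score(utterances):
    embs = model.encode(utterances, convert_to_tensor=True)
    pair_scores = [util.cos_sim(embs[i], embs[i + 1]).item()
                   for i in range(len(utterances) - 1)]
    return sum(pair_scores) / len(pair_scores)
```

Because both goal and candidates are encoded independently, candidate embeddings can be precomputed and reranked against any new goal with a single matrix of cosine similarities, which is the source of the large speed-up over cross-encoder rankers such as DialogRPT reported in the abstract.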

