Two-Level Supervised Contrastive Learning for Response Selection in Multi-Turn Dialogue

03/01/2022
by Wentao Zhang, et al.

Selecting an appropriate response from many candidates given the utterances in a multi-turn dialogue is the key problem for a retrieval-based dialogue system. Existing work formalizes the task as matching between the utterances and a candidate and trains the model with the cross-entropy loss. This paper applies contrastive learning to the problem through the supervised contrastive loss. In this way, the learned representations of positive and negative examples can be more distantly separated in the embedding space, and matching performance can be enhanced. We further develop a new method for supervised contrastive learning, referred to as two-level supervised contrastive learning, and employ it for response selection in multi-turn dialogue. Our method exploits two techniques, sentence token shuffling (STS) and sentence re-ordering (SR), for supervised contrastive learning. Experimental results on three benchmark datasets demonstrate that the proposed method significantly outperforms the contrastive learning baseline and state-of-the-art methods for the task.
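The abstract does not give the paper's exact formulation, but the supervised contrastive loss it builds on can be sketched with the standard batch form (Khosla et al., 2020): each anchor pulls examples with the same label closer and pushes others apart in the embedding space. Below is a minimal NumPy illustration under that assumption; `sentence_token_shuffle` is a hypothetical helper showing the STS augmentation idea, not the authors' implementation.

```python
import numpy as np

def sup_con_loss(embeddings, labels, tau=0.1):
    """Batch supervised contrastive loss (standard Khosla et al. form; a sketch,
    not the paper's exact implementation).

    embeddings: (N, d) float array; rows are L2-normalized here.
    labels: (N,) int array; rows sharing a label are treated as positives.
    """
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / tau                         # pairwise similarities / temperature
    np.fill_diagonal(sim, -np.inf)              # exclude self-comparisons
    sim = sim - sim.max(axis=1, keepdims=True)  # numerical stability
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    pos = (labels[:, None] == labels[None, :]) & ~np.eye(len(labels), dtype=bool)
    has_pos = pos.any(axis=1)
    # mean log-probability over positives, per anchor that has at least one
    mean_pos = np.where(pos, log_prob, 0.0).sum(axis=1) / np.maximum(pos.sum(axis=1), 1)
    return float(-mean_pos[has_pos].mean())

def sentence_token_shuffle(tokens, seed=0):
    """Hypothetical STS-style augmentation: permute the tokens of a sentence."""
    rng = np.random.default_rng(seed)
    return [tokens[i] for i in rng.permutation(len(tokens))]
```

When same-label embeddings are near-identical and cross-label ones are orthogonal, the loss is near zero; mislabeling the positives drives it up, which is the separation effect the abstract describes.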



Related research

11/19/2021
Small Changes Make Big Differences: Improving Multi-turn Response Selection in Dialogue Systems via Fine-Grained Contrastive Learning
Retrieve-based dialogue response selection aims to find a proper respons...

04/15/2022
DialAug: Mixing up Dialogue Contexts in Contrastive Learning for Robust Conversational Modeling
Retrieval-based conversational systems learn to rank response candidates...

09/16/2020
Group-wise Contrastive Learning for Neural Dialogue Generation
Neural dialogue response generation has gained much popularity in recent...

02/16/2023
CluCDD: Contrastive Dialogue Disentanglement via Clustering
A huge number of multi-participant dialogues happen online every day, wh...

11/14/2022
Imagination is All You Need! Curved Contrastive Learning for Abstract Sequence Modeling Utilized on Long Short-Term Dialogue Planning
Motivated by the entailment property of multi-turn dialogues through con...

07/05/2022
Block-SCL: Blocking Matters for Supervised Contrastive Learning in Product Matching
Product matching is a fundamental step for the global understanding of c...

12/01/2022
IRRGN: An Implicit Relational Reasoning Graph Network for Multi-turn Response Selection
The task of response selection in multi-turn dialogue is to find the bes...
