Enhancing Dialogue Generation via Multi-Level Contrastive Learning

09/19/2020
by Xin Li, et al.

Most existing dialogue generation models are data-driven, trained directly on corpora crawled from websites. They focus mainly on improving the model architecture to produce better responses, but pay little attention to the varying quality of the training data. In this paper, we propose a multi-level contrastive learning paradigm that models the fine-grained quality of responses with respect to the query. A Rank-aware Calibration (RC) network is designed to construct the multi-level contrastive optimization objectives. Because these objectives are computed at the sentence level, they may erroneously encourage the generation of uninformative words or suppress informative ones. To tackle this incidental issue, on the one hand, we design a token-level strategy for estimating the instance loss more accurately; on the other hand, we build a Knowledge Inference (KI) component that captures keyword knowledge from the reference during training and exploits this information to encourage the generation of informative words. We evaluate the proposed model on a carefully annotated dialogue dataset, and the results suggest that it generates more relevant and diverse responses than the baseline models.
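To make the idea of a multi-level contrastive objective concrete, here is a minimal sketch of a rank-aware margin loss over responses of graded quality. This is an illustration only, not the paper's RC network: the function name, the linear margin scaling, and the assumption that responses come pre-ordered by annotated quality are all choices made for this example.

```python
def multilevel_contrastive_loss(scores, margin=0.1):
    """Rank-aware margin loss over responses of graded quality.

    scores: model scores for candidate responses to one query, ordered
    from highest annotated quality (index 0) to lowest. The loss pushes
    every higher-quality response to outscore every lower-quality one
    by a margin that grows with the rank gap between them.
    """
    loss = 0.0
    n = len(scores)
    for i in range(n):
        for j in range(i + 1, n):
            gap = margin * (j - i)  # wider margin for larger quality gaps
            # Hinge: penalize only when the ordering is violated
            # or satisfied by less than the required margin.
            loss += max(0.0, gap - (scores[i] - scores[j]))
    return loss


# A correctly ordered, well-separated ranking incurs no loss:
multilevel_contrastive_loss([1.0, 0.5, 0.0])  # returns 0.0
```

In a full model the scores would come from a learned scoring network and the loss would be backpropagated; the pairwise hinge with rank-scaled margins is the part that makes the objective "multi-level" rather than a single positive/negative contrast.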


Related research

- Controlling Dialogue Generation with Semantic Exemplars (08/20/2020)
- Dynamically Retrieving Knowledge via Query Generation for informative dialogue response (07/30/2022)
- Promoting Open-domain Dialogue Generation through Learning Pattern Information between Contexts and Responses (09/06/2023)
- Focus-Constrained Attention Mechanism for CVAE-based Response Generation (09/25/2020)
- Diversifying Neural Dialogue Generation via Negative Distillation (05/05/2022)
- Grayscale Data Construction and Multi-Level Ranking Objective for Dialogue Response Selection (04/06/2020)
- Revisit Out-Of-Vocabulary Problem for Slot Filling: A Unified Contrastive Framework with Multi-level Data Augmentations (02/27/2023)
