
Challenging Instances are Worth Learning: Generating Valuable Negative Samples for Response Selection Training

by Yao Qiu, et al.

A retrieval-based chatbot selects the appropriate response from a set of candidates according to the dialogue context, a task that depends heavily on its response selection module. The response selection module is typically a scoring model that evaluates candidates, and it is usually trained on annotated positive responses paired with sampled negative responses. Sampling negative responses carries two risks: (a) the sampled negative instances, especially those from random sampling, are mostly irrelevant to the dialogue context and too easy to fit during training, yielding a model that is weak in real scenarios; (b) a so-called negative instance may in fact be a valid response, a problem known as the fake negative problem. To address these issues, we employ pre-trained language models such as DialoGPT to construct more challenging negative instances and thereby enhance model robustness. Specifically, we feed garbled context to the pre-trained model to generate responses and then filter out the fake negatives. The resulting negative instances are fluent, context-related, and more challenging for the model to learn, while they cannot be positive. Extensive experiments show that our method brings significant and stable improvements in dialogue response selection.
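The two key steps the abstract describes, corrupting the context before generation and filtering fake negatives, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the shuffle-based garbling strategy, the token-overlap filtering heuristic, and its threshold are all assumptions chosen for clarity; in the paper, generation would be done by a pre-trained model such as DialoGPT conditioned on the garbled context.

```python
import random


def garble_context(utterances, seed=0):
    """Corrupt a multi-turn context by shuffling utterance order.

    Shuffling is one possible garbling strategy (an assumption here);
    the goal is that a generator conditioned on this corrupted context
    produces fluent, topic-related responses that cannot be correct
    answers to the original context.
    """
    rng = random.Random(seed)
    garbled = utterances[:]
    rng.shuffle(garbled)
    return garbled


def filter_fake_negatives(candidates, gold_response, overlap_threshold=0.7):
    """Drop generated candidates that are too close to the gold response.

    A simple token-overlap heuristic stands in for the paper's
    fake-negative filtering step: candidates sharing a high fraction
    of tokens with the annotated positive are discarded.
    """
    gold_tokens = set(gold_response.lower().split())
    kept = []
    for cand in candidates:
        cand_tokens = set(cand.lower().split())
        overlap = len(cand_tokens & gold_tokens) / max(len(cand_tokens), 1)
        if overlap < overlap_threshold:
            kept.append(cand)
    return kept
```

The surviving candidates serve as hard negatives for training the selection model: they are generated text (so fluent) conditioned on related but corrupted context (so topical yet wrong), and the filter removes any that accidentally match the true response.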




Pneg: Prompt-based Negative Response Generation for Dialogue Response Selection Task

In retrieval-based dialogue systems, a response selection model acts as ...

Global-Selector: A New Benchmark Dataset and Model Architecture for Multi-turn Response Selection

As an essential component of dialogue systems, multi-turn response selec...

Negative Training for Neural Dialogue Response Generation

Although deep learning models have brought tremendous advancements to th...

Strategy of the Negative Sampling for Training Retrieval-Based Dialogue Systems

The article describes the new approach for quality improvement of automa...

Diversifying Neural Dialogue Generation via Negative Distillation

Generative dialogue models suffer badly from the generic response proble...

Context Matters in Semantically Controlled Language Generation for Task-oriented Dialogue Systems

This work combines information about the dialogue history encoded by pre...

Transformer-Based Conditioned Variational Autoencoder for Dialogue Generation

In human dialogue, a single query may elicit numerous appropriate respon...