Relevance-Promoting Language Model for Short-Text Conversation

11/26/2019
by   Xin Li, et al.

Despite the effectiveness of the sequence-to-sequence framework on the task of Short-Text Conversation (STC), the issue of under-exploiting the training data (i.e., the supervision signals from the query text are ignored) remains unresolved. Moreover, the commonly adopted maximization-based decoding strategies, which tend to generate generic or repetitive responses, are ill-suited to the STC task. In this paper, we propose to formulate the STC task as a language modeling problem and tailor-make a training strategy to adapt a language model for response generation. To enhance generation performance, we design a relevance-promoting transformer language model that performs additional supervised source attention after the self-attention, increasing the weight of informative query tokens when computing token-level representations. The model further refines the query representation during training with relevance clues inferred from its multiple references. At test time, we adopt a randomization-over-maximization decoding strategy to reduce the generation of generic responses. Experimental results on a large Chinese STC dataset demonstrate the superiority of the proposed model on both relevance and diversity metrics.
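The abstract contrasts maximization-based decoding (argmax at each step, which favors bland, high-frequency responses) with a randomization-over-maximization strategy. The paper's exact procedure is not given here; the following is a minimal, hypothetical sketch of the general idea, using standard top-k sampling over next-token scores, with the function name `top_k_sample` being an illustrative assumption rather than the authors' implementation:

```python
import numpy as np

def top_k_sample(logits, k=5, rng=None):
    """Randomization-over-maximization sketch: instead of taking the argmax
    token (which tends toward generic responses), sample among the k
    highest-scoring tokens, weighted by a softmax over those scores only."""
    rng = rng or np.random.default_rng(0)
    logits = np.asarray(logits, dtype=float)
    top = np.argpartition(logits, -k)[-k:]           # indices of the k best tokens
    probs = np.exp(logits[top] - logits[top].max())  # stable softmax over top-k
    probs /= probs.sum()
    return int(rng.choice(top, p=probs))

# Toy next-token scores over a 6-token vocabulary; token 3 is the greedy choice,
# but tokens 1 and 5 remain reachable, injecting diversity into the response.
logits = [0.1, 2.0, 1.5, 3.0, 0.5, 2.5]
choice = top_k_sample(logits, k=3)
assert choice in {1, 3, 5}  # always one of the 3 highest-scoring tokens
```

Greedy decoding would always emit token 3 here; restricting randomness to the top-k keeps the sampled token plausible while breaking the repetitive, generic pattern the abstract attributes to pure maximization.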


