Local Explanation of Dialogue Response Generation

by Yi-Lin Tuan et al.

In comparison to the interpretation of classification models, the explanation of sequence generation models is also an important problem, yet it has received little attention. In this work, we study model-agnostic explanations of a representative text generation task: dialogue response generation. Dialogue response generation is challenging with its open-ended sentences and multiple acceptable responses. To gain insights into the reasoning process of a generation model, we propose a new method, local explanation of response generation (LERG), that regards the explanations as the mutual interaction of segments in input and output sentences. LERG views sequence prediction as uncertainty estimation of a human response and then creates explanations by perturbing the input and calculating the certainty change over the human response. We show that LERG adheres to desired properties of explanations for text generation, including unbiased approximation, consistency, and cause identification. Empirically, our results show that our method consistently improves other widely used methods on proposed automatic and human evaluation metrics for this new task by 4.4-12.8%. Our analysis demonstrates that LERG can extract both explicit and implicit relations between input and output segments.
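The perturb-and-measure idea described in the abstract can be sketched in a few lines. The snippet below is a minimal, hypothetical illustration (not the authors' implementation): a hand-built toy scorer stands in for a real dialogue model's token likelihoods, and the explanation loop drops one input segment at a time and records how the log-likelihood of each reference-response token changes, yielding an input-segment x output-token interaction matrix.

```python
import math

# Hypothetical toy "dialogue model": hand-set associations between input
# tokens and response tokens stand in for a real seq2seq model's likelihoods.
ASSOC = {("hungry", "food"): 2.0, ("hungry", "eat"): 1.5, ("where", "at"): 1.0}

def token_logprob(y_tok, x_toks):
    """Unnormalized log-likelihood of one response token given the input."""
    return math.log(0.1 + sum(ASSOC.get((x, y_tok), 0.0) for x in x_toks))

def perturbation_saliency(x_toks, y_toks):
    """Drop each input segment in turn and record the certainty change of
    every reference-response token (the perturb-and-measure core idea)."""
    matrix = []
    for i in range(len(x_toks)):
        perturbed = x_toks[:i] + x_toks[i + 1:]   # input with segment i removed
        row = [token_logprob(y, x_toks) - token_logprob(y, perturbed)
               for y in y_toks]
        matrix.append(row)
    return matrix  # matrix[i][t]: contribution of input segment i to output token t

phi = perturbation_saliency(["where", "are", "you", "hungry"],
                            ["eat", "food", "now"])
```

With this toy scorer, dropping "hungry" lowers the certainty of "eat" and "food" (positive saliency), while dropping the unrelated "are" changes nothing, so the matrix separates influential from inert input segments.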



Diagnostics-Guided Explanation Generation

Explanations shed light on a machine learning model's rationales and can...

Evaluating Dialogue Generation Systems via Response Selection

Existing automatic evaluation metrics for open-domain dialogue response ...

Conditioned Text Generation with Transfer for Closed-Domain Dialogue Systems

Scarcity of training data for task-oriented dialogue systems is a well k...

Quality Evaluation of the Low-Resource Synthetically Generated Code-Mixed Hinglish Text

In this shared task, we seek the participating teams to investigate the ...

Scalable Sentiment for Sequence-to-sequence Chatbot Response with Performance Analysis

Conventional seq2seq chatbot models only try to find the sentences with ...

N-best Response-based Analysis of Contradiction-awareness in Neural Response Generation Models

Avoiding the generation of responses that contradict the preceding conte...
