Behaviour Trees for Conversational Explanation Experiences

11/11/2022
by Anjana Wijekoon, et al.

Explainable AI (XAI) has the potential to make a significant impact on building trust and improving the satisfaction of users who interact with an AI system for decision-making. There is an abundance of explanation techniques in the literature to address this need. Recently, it has been shown that a user is likely to have multiple explanation needs that should be addressed by a constellation of explanation techniques, which we refer to as an explanation strategy. This paper focuses on how users interact with an XAI system to have these multiple explanation needs satisfied by an explanation strategy. For this purpose, the paper introduces the concept of an "explanation experience": the episodes of user interactions captured by the XAI system when explaining the decisions made by its AI system. We explore how to enable and capture explanation experiences through conversational interactions, modelling the interactive explanation experience as a dialogue model. Specifically, Behaviour Trees (BTs) are used to model conversational pathways and chatbot behaviours. A BT dialogue model is easily personalised by dynamically extending or modifying it to attend to different user needs and explanation strategies. An evaluation with a real-world use case shows that BTs have a number of properties that lend themselves naturally to modelling and capturing explanation experiences, as compared to traditionally used state transition models.
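To illustrate the idea of a Behaviour Tree dialogue model that can be extended at runtime to personalise an explanation strategy, the sketch below composes chatbot behaviours with standard BT node types (Sequence, Fallback and leaf actions). The specific behaviours (greet, explain_why, explain_how) are hypothetical placeholders for illustration, not the paper's actual dialogue model.

```python
from enum import Enum


class Status(Enum):
    SUCCESS = 1
    FAILURE = 2
    RUNNING = 3


class Action:
    """Leaf node wrapping a single chatbot behaviour."""
    def __init__(self, name, behaviour):
        self.name = name
        self.behaviour = behaviour  # callable: context -> Status

    def tick(self, context):
        return self.behaviour(context)


class Sequence:
    """Composite node: succeeds only if all children succeed, in order."""
    def __init__(self, children):
        self.children = list(children)

    def tick(self, context):
        for child in self.children:
            status = child.tick(context)
            if status != Status.SUCCESS:
                return status
        return Status.SUCCESS


class Fallback:
    """Composite node: tries children until one succeeds."""
    def __init__(self, children):
        self.children = list(children)

    def tick(self, context):
        for child in self.children:
            status = child.tick(context)
            if status != Status.FAILURE:
                return status
        return Status.FAILURE


# Hypothetical dialogue behaviours; each appends an utterance to the
# interaction log, which is one way an explanation experience could be captured.
def greet(context):
    context["log"].append("bot: Hello! What would you like explained?")
    return Status.SUCCESS


def ask_need(context):
    context["log"].append("user: Why did I receive this decision?")
    context["need"] = "why"
    return Status.SUCCESS


def explain_why(context):
    if context.get("need") != "why":
        return Status.FAILURE
    context["log"].append("bot: The decision was driven mainly by features X and Y.")
    return Status.SUCCESS


# Base dialogue: greet, elicit the explanation need, then answer it.
dialogue = Sequence([
    Action("greet", greet),
    Action("ask_need", ask_need),
    Fallback([Action("explain_why", explain_why)]),
])


# Personalisation: extend the tree at runtime with another explanation
# technique so a different need ("how") can also be satisfied.
def explain_how(context):
    if context.get("need") != "how":
        return Status.FAILURE
    context["log"].append("bot: Increasing X above 5 would have changed the outcome.")
    return Status.SUCCESS


dialogue.children[-1].children.append(Action("explain_how", explain_how))

context = {"log": []}
dialogue.tick(context)
print("\n".join(context["log"]))
```

Because each explanation technique is an interchangeable subtree, attending to a new user need only requires grafting another branch onto the tree, rather than rewiring a state transition model.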

