Explanation as a process: user-centric construction of multi-level and multi-modal explanations

10/07/2021
by Bettina Finzel, et al.

In recent years, XAI research has mainly been concerned with developing new technical approaches to explaining deep learning models. Only recently has research begun to acknowledge the need to tailor explanations to the contexts and requirements of different stakeholders: explanations must suit not only model developers, but also domain experts and end users. Hence, in order to satisfy different stakeholders, explanation methods need to be combined. While multi-modal explanations have been used to make model predictions more transparent, less research has treated explanation as a process in which users can ask for information according to the level of understanding they have reached at a given point in time. Consequently, besides multi-modal explanations, users should be given the opportunity to explore explanations at different levels of abstraction. We present a process-based approach that combines multi-level and multi-modal explanations. Through conversational interaction, the user can ask for textual explanations or visualizations in a drill-down manner. We use Inductive Logic Programming (ILP), an interpretable machine learning approach, to learn a comprehensible model. Furthermore, we present an algorithm that creates an explanatory tree for each example whose classification is to be explained. The user can navigate this explanatory tree to obtain answers at different levels of detail. We provide a proof-of-concept implementation for concepts induced from a semantic net about living beings.
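
To make the idea of an explanatory tree concrete, the following is a minimal Python sketch, not the authors' implementation: it assumes the ILP-learned model is available as a set of rules mapping conclusions to their premises, and all rule, fact, and function names (e.g. `build_explanatory_tree`, the `mammal(bella)` example) are hypothetical illustrations of the living-beings domain.

```python
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class ExplanationNode:
    """One answer in the explanation process; children refine it."""
    statement: str                           # textual explanation at this level
    children: list[ExplanationNode] = field(default_factory=list)

def build_explanatory_tree(goal: str, rules: dict, facts: set) -> ExplanationNode:
    """Recursively prove `goal` and record each proof step as a tree node.

    `rules` maps a conclusion to the premises of the (ILP-learned) rule
    that derives it; `facts` are ground atoms that need no further proof.
    """
    node = ExplanationNode(goal)
    if goal not in facts:                    # facts are the most detailed level
        for premise in rules.get(goal, []):
            node.children.append(build_explanatory_tree(premise, rules, facts))
    return node

def drill_down(node: ExplanationNode, depth: int = 0) -> None:
    """Stand-in for conversational interaction: show one level of detail,
    then descend only into the sub-explanations the user asks about."""
    print("  " * depth + node.statement)
    for child in node.children:
        if input(f"{'  ' * depth}Explain '{child.statement}'? [y/n] ") == "y":
            drill_down(child, depth + 1)

# Hypothetical rules induced from a semantic net about living beings.
rules = {
    "mammal(bella)": ["has_fur(bella)", "gives_milk(bella)"],
    "has_fur(bella)": ["covered_by(bella, hair)"],
}
facts = {"gives_milk(bella)", "covered_by(bella, hair)"}

if __name__ == "__main__":
    drill_down(build_explanatory_tree("mammal(bella)", rules, facts))
```

In this sketch, each node corresponds to one answer the user can request, and descending into a child corresponds to asking for a more detailed explanation, mirroring the drill-down interaction described above; the multi-modal aspect could be added by attaching a visualization to each node alongside its textual statement.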


