ExAID: A Multimodal Explanation Framework for Computer-Aided Diagnosis of Skin Lesions

01/04/2022
by Adriano Lucieri et al.

One principal impediment to the successful deployment of AI-based Computer-Aided Diagnosis (CAD) systems in clinical workflows is their lack of transparent decision making. Although commonly used eXplainable AI methods provide some insight into opaque algorithms, such explanations are usually convoluted and not readily comprehensible except by highly trained experts. Explaining decisions about the malignancy of skin lesions from dermoscopic images demands particular clarity, as the underlying medical problem definition is itself ambiguous. This work presents ExAID (Explainable AI for Dermatology), a novel framework for biomedical image analysis that provides multi-modal, concept-based explanations consisting of easy-to-understand textual explanations supplemented by visual maps justifying the predictions. ExAID relies on Concept Activation Vectors to map human concepts to those learnt by arbitrary Deep Learning models in latent space, and on Concept Localization Maps to highlight those concepts in the input space. This identification of relevant concepts is then used to construct fine-grained textual explanations supplemented by concept-wise location information, yielding comprehensive and coherent multi-modal explanations. All information is presented in a diagnostic interface for use in clinical routines. An educational mode provides dataset-level explanation statistics and tools for data and model exploration to aid medical research and education. Through rigorous quantitative and qualitative evaluation of ExAID, we show the utility of multi-modal explanations in CAD-assisted scenarios even in the case of wrong predictions. We believe that ExAID will provide dermatologists with an effective screening tool that they both understand and trust, and that it will serve as a basis for similar applications in other biomedical imaging fields.
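To make the concept-mapping step concrete, the sketch below shows how a Concept Activation Vector can be obtained as the normal of a linear probe that separates the activations of concept examples from those of random examples, following the general TCAV recipe that ExAID builds on. The layer, the concept label, the synthetic activations, and the projection-based scoring rule are illustrative assumptions for this sketch, not ExAID's actual implementation.

    # Minimal sketch: computing a Concept Activation Vector (CAV) and a
    # concept presence score. Concept names and data are placeholders.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # In practice these are activations of a chosen hidden layer of the trained
    # skin-lesion classifier, for images annotated with a dermoscopic concept
    # (e.g. "pigment network") and for random negative images. Here we use
    # synthetic placeholders so the sketch runs standalone.
    acts_concept = rng.normal(loc=1.0, size=(100, 512))  # concept examples
    acts_random = rng.normal(loc=0.0, size=(100, 512))   # random examples

    X = np.vstack([acts_concept, acts_random])
    y = np.concatenate([np.ones(100), np.zeros(100)])

    # Linear probe separating concept from random activations; the CAV is the
    # unit-normalised normal of its decision boundary.
    probe = LogisticRegression(max_iter=1000).fit(X, y)
    cav = probe.coef_.ravel()
    cav /= np.linalg.norm(cav)

    # Concept presence score for a new sample's activation: signed projection
    # onto the CAV. Detected concepts of this kind feed the textual
    # explanations, while Concept Localization Maps additionally highlight
    # where a concept appears in the input image.
    new_activation = rng.normal(loc=1.0, size=512)
    score = float(new_activation @ cav)
    print(f"concept presence score: {score:.3f}")

A positive score indicates that the sample's activation points in the direction of the learnt concept; thresholding such scores across a bank of dermoscopic concepts is one plausible way to decide which concepts to mention in a textual explanation.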


