Evaluating Explainable AI on a Multi-Modal Medical Imaging Task: Can Existing Algorithms Fulfill Clinical Requirements?

03/12/2022
by Weina Jin, et al.

Being able to explain predictions to clinical end-users is a necessity for leveraging the power of artificial intelligence (AI) models in clinical decision support. For medical images, a feature attribution map, or heatmap, is the most common form of explanation; it highlights the features that are important to the AI model's prediction. However, it is unknown how well heatmaps explain decisions on multi-modal medical images, where each image modality or channel visualizes distinct clinical information about the same underlying biomedical phenomenon. Understanding such modality-dependent features is essential for clinical users' interpretation of AI decisions. To tackle this clinically important but technically neglected problem, we propose the modality-specific feature importance (MSFI) metric. It encodes the clinical image- and explanation-interpretation patterns of modality prioritization and modality-specific feature localization. We conduct a clinical-requirement-grounded, systematic evaluation using computational methods and a clinician user study. The results show that all 16 examined heatmap algorithms failed to fulfill clinical requirements: they did not correctly indicate the AI model's decision process or decision quality. The evaluation and the MSFI metric can guide the design and selection of XAI algorithms that meet clinical requirements for multi-modal explanation.
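To make the metric concrete, the sketch below illustrates one plausible reading of MSFI as the abstract describes it: for each modality, measure how much of the heatmap's positive attribution falls inside the ground-truth mask of that modality's clinically relevant features, then average these localization scores weighted by each modality's importance to the prediction. This is a minimal sketch under those assumptions, not the paper's reference implementation; the function name `msfi`, the dict-based inputs, and the normalization by the total modality weight are illustrative choices.

```python
import numpy as np

def msfi(heatmaps, feature_masks, modality_importance):
    """Sketch of a modality-specific feature importance (MSFI) score.

    heatmaps:            dict of modality -> saliency map (array)
    feature_masks:       dict of modality -> binary mask of ground-truth
                         modality-specific features (same shape as heatmap)
    modality_importance: dict of modality -> importance weight >= 0
    Returns a score in [0, 1]; higher means attribution concentrates on the
    important features of the important modalities.
    """
    weighted, total_weight = 0.0, 0.0
    for m, s in heatmaps.items():
        s = np.clip(s, 0, None)                 # keep positive attribution only
        total = s.sum()
        inside = s[feature_masks[m] > 0].sum()  # attribution inside the mask
        localization = inside / total if total > 0 else 0.0
        weighted += modality_importance[m] * localization
        total_weight += modality_importance[m]
    return weighted / total_weight if total_weight > 0 else 0.0

# Hypothetical usage: two MRI modalities with random data, for illustration only.
rng = np.random.default_rng(0)
heatmaps = {"T1": rng.random((64, 64)), "FLAIR": rng.random((64, 64))}
masks = {"T1": np.zeros((64, 64)), "FLAIR": np.zeros((64, 64))}
masks["FLAIR"][20:40, 20:40] = 1           # lesion visible mainly on FLAIR
weights = {"T1": 0.2, "FLAIR": 0.8}        # FLAIR assumed more decision-relevant
print(f"MSFI = {msfi(heatmaps, masks, weights):.3f}")  # 1.0 = perfect localization
```

Under this reading, a heatmap that spreads attribution uniformly scores low even if it touches the lesion, while one that concentrates attribution on the lesion within the decision-relevant modality scores near 1.0, which matches the abstract's emphasis on both modality prioritization and feature localization.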

