Explaining Clinical Decision Support Systems in Medical Imaging using Cycle-Consistent Activation Maximization

10/09/2020
by Alexander Katzmann, et al.

Clinical decision support using deep neural networks has become a topic of steadily growing interest. While recent work has repeatedly demonstrated that deep learning offers major advantages over traditional methods for medical image classification, clinicians are often hesitant to adopt the technology because its underlying decision-making process is considered opaque and difficult to comprehend. In recent years, this has been addressed by a variety of approaches that have successfully contributed to providing deeper insight. Most notably, additive feature attribution methods can propagate decisions back into the input space by creating a saliency map which allows the practitioner to "see what the network sees." However, the generated maps can become poor in quality and noisy when only limited data is available, a typical scenario in clinical contexts. We propose a novel decision explanation scheme based on CycleGAN activation maximization which generates high-quality visualizations of classifier decisions even on smaller datasets. We conducted a user study in which these visualizations significantly outperformed existing methods on the LIDC dataset for lung lesion malignancy classification. With our approach we make a significant contribution to a better understanding of clinical decision support systems based on deep neural networks and thus aim to foster overall clinical acceptance.
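To make the two ideas referenced in the abstract concrete, the minimal sketch below shows (1) a vanilla gradient saliency map, representative of the additive feature attribution baselines, and (2) plain input-space activation maximization, the generic technique the proposed CycleGAN-based scheme builds on. The toy model, input size, and class index are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

# Toy stand-in for a lesion-malignancy classifier (illustrative assumption).
model = nn.Sequential(
    nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(8, 2),                      # class index 1 = "malignant" (assumed)
)
model.eval()

# (1) Additive feature attribution baseline: vanilla gradient saliency map.
image = torch.randn(1, 1, 64, 64, requires_grad=True)   # placeholder CT patch
model(image)[0, 1].backward()
saliency = image.grad.abs().squeeze()    # |d logit / d pixel|, shape (64, 64)

# (2) Plain activation maximization: optimize the input itself so that the
#     "malignant" logit grows as large as possible.
x = torch.zeros(1, 1, 64, 64, requires_grad=True)
opt = torch.optim.Adam([x], lr=0.05)
for _ in range(100):
    opt.zero_grad()
    loss = -model(x)[0, 1]               # maximize the target logit
    loss.backward()
    opt.step()
```

Run unconstrained, the loop in step (2) typically produces noisy, adversarial-looking patterns; the paper's contribution is to constrain this search with a CycleGAN so that the maximizing images remain realistic and clinically interpretable.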

Related research

- Evaluating Explainable AI on a Multi-Modal Medical Imaging Task: Can Existing Algorithms Fulfill Clinical Requirements? (03/12/2022)
- One Map Does Not Fit All: Evaluating Saliency Map Explanation on Multi-Modal Medical Images (07/11/2021)
- Transparency of Deep Neural Networks for Medical Image Analysis: A Review of Interpretability Methods (11/01/2021)
- Explaining Deep Neural Networks for Point Clouds using Gradient-based Visualisations (07/26/2022)
- A Deep Neural Architecture for Harmonizing 3-D Input Data Analysis and Decision Making in Medical Imaging (03/01/2023)
- Interactive Naming for Explaining Deep Neural Networks: A Formative Study (12/18/2018)
- Interpreting Medical Image Classifiers by Optimization Based Counterfactual Impact Analysis (04/03/2020)
