Computational analysis of pathological image enables interpretable prediction for microsatellite instability

10/07/2020
by Jin Zhu, et al.

Microsatellite instability (MSI) is associated with several tumor types, and its status has become increasingly important in guiding treatment decisions. In clinical practice, however, distinguishing MSI from microsatellite-stable tumors is challenging, since diagnosing MSI requires additional genetic or immunohistochemical tests. In this study, interpretable pathological image analysis strategies are established to help medical experts automatically identify MSI. The strategies require only ubiquitous haematoxylin and eosin (H&E)-stained whole-slide images, and achieve decent performance on three cohorts collected from The Cancer Genome Atlas. The strategies provide interpretability in two respects: image-level interpretability is achieved by generating localization heat maps of important regions from the deep learning network, while feature-level interpretability is attained through feature importance and pathological feature interaction analysis. Notably, both the image-level and the feature-level analyses show that color features and texture characteristics contribute the most to the MSI predictions. The classification models under the proposed strategies can therefore serve not only as an efficient tool for predicting a patient's MSI status, but also as a source of clinically meaningful insights for pathologists.
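The feature-level interpretability described above can be sketched with a standard feature-importance analysis. This is a minimal illustration, not the paper's pipeline: the feature names (`mean_hue`, `glcm_contrast`, etc.), the synthetic data, and the choice of a random forest are all assumptions made for the example; the synthetic label is deliberately driven by the color and texture stand-ins to mimic the paper's finding.

```python
# Hedged sketch of feature-level interpretability via tree-based
# feature importances on hypothetical color/texture descriptors.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 400

# Synthetic stand-ins for per-slide color and texture features
# (names are illustrative, not taken from the paper).
feature_names = ["mean_hue", "mean_saturation", "glcm_contrast", "glcm_entropy"]
X = rng.normal(size=(n, len(feature_names)))

# Make the synthetic label depend on a color feature and a texture
# feature, mimicking the reported dominance of color/texture cues.
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=n) > 0).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
importances = dict(zip(feature_names, clf.feature_importances_))
top_feature = max(importances, key=importances.get)
```

Ranking `importances` then indicates which descriptors drive the classifier's MSI predictions; in a real analysis, these would be computed on hand-crafted features extracted from the H&E whole-slide images.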


