An Interpretable Deep Hierarchical Semantic Convolutional Neural Network for Lung Nodule Malignancy Classification

06/02/2018
by Shiwen Shen, et al.

While deep learning methods are increasingly being applied to tasks such as computer-aided diagnosis, these models are difficult to interpret, do not incorporate prior domain knowledge, and are often considered a "black box." This lack of interpretability hinders them from being fully understood by target users such as radiologists. In this paper, we present a novel interpretable deep hierarchical semantic convolutional neural network (HSCNN) to predict whether a given pulmonary nodule observed on a computed tomography (CT) scan is malignant. Our network provides two levels of output: 1) low-level radiologist semantic features, and 2) a high-level malignancy prediction score. The low-level semantic outputs quantify the diagnostic features used by radiologists and serve to explain how the model interprets the images in an expert-driven manner. The information from these low-level tasks, along with the representations learned by the convolutional layers, is then combined and used to infer the high-level task of predicting nodule malignancy. This unified architecture is trained by optimizing a global loss function that includes both low- and high-level tasks, thereby learning all the parameters within a joint framework. Our experimental results using the Lung Image Database Consortium (LIDC) show that the proposed method not only produces interpretable lung cancer predictions but also achieves significantly better results than common 3D CNN approaches.
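The sketch below illustrates the two-level design described in the abstract: a shared 3D convolutional backbone, one low-level head per radiologist semantic feature, and a high-level malignancy head that consumes both the shared features and the low-level task outputs, all trained with a single global loss. This is a minimal PyTorch sketch under assumed settings; the layer sizes, the number of semantic attributes, the loss weight lam, and the names HSCNN and hscnn_loss are illustrative, not the authors' exact configuration.

```python
# Minimal sketch of a hierarchical semantic CNN (assumed configuration).
import torch
import torch.nn as nn

class HSCNN(nn.Module):
    def __init__(self, num_semantic_tasks=5):
        super().__init__()
        # Shared 3D convolutional feature extractor over the nodule volume.
        self.backbone = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1), nn.Flatten(),
        )
        # One low-level head per radiologist semantic feature
        # (e.g. calcification, margin, texture), each a binary task here.
        self.semantic_heads = nn.ModuleList(
            nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 2))
            for _ in range(num_semantic_tasks)
        )
        # High-level malignancy head takes the shared features plus the
        # concatenated low-level task representations.
        self.malignancy_head = nn.Sequential(
            nn.Linear(32 + 2 * num_semantic_tasks, 32), nn.ReLU(),
            nn.Linear(32, 2),
        )

    def forward(self, x):
        feats = self.backbone(x)                              # (B, 32)
        semantic_logits = [h(feats) for h in self.semantic_heads]
        combined = torch.cat([feats] + semantic_logits, dim=1)
        return semantic_logits, self.malignancy_head(combined)

def hscnn_loss(semantic_logits, malignancy_logits,
               semantic_labels, malignancy_labels, lam=0.2):
    """Global loss: high-level cross-entropy plus a weighted sum of
    the low-level semantic cross-entropies, optimized jointly."""
    ce = nn.CrossEntropyLoss()
    low = sum(ce(logits, semantic_labels[:, i])
              for i, logits in enumerate(semantic_logits))
    high = ce(malignancy_logits, malignancy_labels)
    return high + lam * low
```

A forward pass would take a single-channel 3D nodule patch, e.g. a tensor of shape (batch, 1, 32, 32, 32); because both output levels share one backbone and one loss, the semantic predictions that explain the model are learned together with the malignancy score rather than in a separate post hoc step.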
