A Quantitative Comparison between Shannon and Tsallis-Havrda-Charvat Entropies Applied to Cancer Outcome Prediction

03/22/2022
by Thibaud Brochet, et al.

In this paper, we quantitatively compare loss functions based on the parameterized Tsallis-Havrda-Charvat entropy and on the classical Shannon entropy for training a deep network on the small datasets typically encountered in medical applications. Shannon cross-entropy is the most widely used loss function for neural networks applied to image segmentation, classification, and detection. Tsallis-Havrda-Charvat cross-entropy generalizes it through a parameter α, with Shannon entropy recovered as the particular case α = 1. We compare the two entropies on a medical application: predicting recurrence in patients with head and neck or lung cancer after treatment. Based on both CT images and patient information, a multitask deep neural network is proposed that performs a recurrence prediction task, using cross-entropy as the loss function, together with an image reconstruction task. The influence of the parameter α on the final prediction results is studied. The experiments are conducted on two datasets totaling 580 patients, of whom 434 suffered from head and neck cancers and 146 from lung cancers. The results show that Tsallis-Havrda-Charvat cross-entropy can achieve better prediction accuracy than Shannon cross-entropy for some values of α.
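For illustration, below is a minimal NumPy sketch of a Tsallis-Havrda-Charvat cross-entropy loss, assuming the commonly used form L_α(y, p) = (1/(α-1)) Σ_i y_i (1 - p_i^(α-1)), which reduces to the Shannon cross-entropy -Σ_i y_i log p_i as α → 1. The exact formulation and the multitask architecture used in the paper may differ; the function name and values here are purely illustrative.

```python
import numpy as np

def thc_cross_entropy(y_true, y_pred, alpha=1.5, eps=1e-12):
    """Tsallis-Havrda-Charvat cross-entropy (assumed form).

    y_true: one-hot targets, shape (n_samples, n_classes)
    y_pred: predicted probabilities, same shape
    alpha:  entropic index; alpha -> 1 recovers Shannon cross-entropy
    """
    p = np.clip(y_pred, eps, 1.0)
    if np.isclose(alpha, 1.0):
        # Shannon limit: standard categorical cross-entropy
        return -np.mean(np.sum(y_true * np.log(p), axis=1))
    # Parameterized form: (1 / (alpha - 1)) * sum_i y_i * (1 - p_i^(alpha - 1))
    return np.mean(np.sum(y_true * (1.0 - p ** (alpha - 1.0)), axis=1) / (alpha - 1.0))

# Quick check: for alpha close to 1 the loss approaches the Shannon value.
y = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
p = np.array([[0.7, 0.2, 0.1], [0.1, 0.8, 0.1]])
print(thc_cross_entropy(y, p, alpha=1.001))      # close to Shannon cross-entropy
print(-np.mean(np.sum(y * np.log(p), axis=1)))   # Shannon cross-entropy
print(thc_cross_entropy(y, p, alpha=1.5))        # parameterized variant
```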
