Using Monte Carlo dropout and bootstrap aggregation for uncertainty estimation in radiation therapy dose prediction with deep learning neural networks

11/01/2020
by Dan Nguyen, et al.

Recently, artificial intelligence technologies and algorithms have become a major focus for advancements in treatment planning for radiation therapy. As these tools begin to be incorporated into the clinical workflow, a major concern from clinicians is not whether the model is accurate, but whether the model can express to a human operator when it does not know if its answer is correct. We propose to use Monte Carlo dropout (MCDO) and the bootstrap aggregation (bagging) technique on deep learning models to produce uncertainty estimations for radiation therapy dose prediction. We show that both models are capable of generating a reasonable uncertainty map and, with our proposed scaling technique, of producing interpretable uncertainties and bounds on the prediction and on any relevant metrics. Performance-wise, bagging provides a statistically significant reduction in loss value and in errors for most of the metrics investigated in this study. The addition of bagging was able to further reduce errors by another 0.34 relative to the baseline framework. Overall, the bagging framework provided a significantly lower MAE of 2.62, as opposed to the baseline framework's MAE of 2.87. The usefulness of bagging, from a performance standpoint alone, depends heavily on the problem and the acceptable predictive error, and its high upfront computational cost during training should be factored into deciding whether it is advantageous to use. In terms of deployment with uncertainty estimations turned on, both frameworks offer the same prediction time of about 12 seconds. As an ensemble-based metaheuristic, bagging can be used with existing machine learning architectures to improve stability and performance, and MCDO can be applied to any deep learning model that has dropout as part of its architecture.
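For readers who want a concrete picture of how the two uncertainty estimates are obtained at inference time, the following is a minimal PyTorch sketch, not the authors' implementation: dropout layers are kept stochastic over T repeated forward passes (MCDO), or the predictions of an ensemble trained on bootstrap resamples are pooled (bagging), and the per-voxel mean and standard deviation across samples give the prediction and its uncertainty map. The function names, the model and input variables, and the default of 30 forward passes are illustrative assumptions.

```python
# Illustrative sketch only. "model", "models", and "ct_volume" stand in for a
# trained dose-prediction network (or ensemble) and its input tensor.
import torch
import torch.nn as nn


def enable_mc_dropout(model: nn.Module) -> None:
    """Put only the dropout layers in train mode so they remain stochastic
    at inference time, while the rest of the network stays in eval mode."""
    model.eval()
    for module in model.modules():
        if isinstance(module, (nn.Dropout, nn.Dropout2d, nn.Dropout3d)):
            module.train()


@torch.no_grad()
def mc_dropout_predict(model: nn.Module, ct_volume: torch.Tensor, T: int = 30):
    """Run T stochastic forward passes; return the mean dose prediction and
    the per-voxel standard deviation as the uncertainty map."""
    enable_mc_dropout(model)
    samples = torch.stack([model(ct_volume) for _ in range(T)], dim=0)
    return samples.mean(dim=0), samples.std(dim=0)


@torch.no_grad()
def bagging_predict(models, ct_volume: torch.Tensor):
    """Pool predictions from models trained on bootstrap resamples of the
    training set; the spread across members serves as the uncertainty map."""
    for m in models:
        m.eval()
    samples = torch.stack([m(ct_volume) for m in models], dim=0)
    return samples.mean(dim=0), samples.std(dim=0)
```

In both cases the raw output is a mean prediction and a per-voxel spread; a subsequent scaling step, such as the one proposed in the paper, can then turn that spread into interpretable uncertainties and bounds.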


