Quantifying the uncertainty of neural networks using Monte Carlo dropout for deep learning based quantitative MRI

12/02/2021
by Mehmet Yigit Avci, et al.

Dropout is conventionally used during the training phase as a regularization method and for quantifying uncertainty in deep learning. We propose to use dropout during both training and inference, averaging multiple predictions to improve accuracy while reducing and quantifying the uncertainty. The method is evaluated on fractional anisotropy (FA) and mean diffusivity (MD) maps obtained from scans with only three diffusion directions. With our method, accuracy improves significantly over network outputs produced without dropout, especially when the training dataset is small. Moreover, the method generates confidence maps, which may aid in the diagnosis of unseen pathology or artifacts.
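The abstract describes the core Monte Carlo dropout recipe: keep dropout layers active at inference, run several stochastic forward passes, average them for the prediction, and take their spread as a voxel-wise confidence map. Below is a minimal PyTorch sketch of that idea; the toy network, layer sizes, and number of passes are illustrative assumptions, not details from the paper.

```python
# Minimal Monte Carlo dropout sketch (hypothetical architecture, not the
# authors' network). Assumes a per-voxel regression from 3-direction signals.
import torch
import torch.nn as nn

class DropoutRegressor(nn.Module):
    """Toy voxel-wise regressor with a dropout layer."""
    def __init__(self, in_features=3, hidden=64, p=0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_features, hidden),
            nn.ReLU(),
            nn.Dropout(p),          # kept active at inference for MC dropout
            nn.Linear(hidden, 1),   # e.g., one scalar map value (FA or MD)
        )

    def forward(self, x):
        return self.net(x)

@torch.no_grad()
def mc_dropout_predict(model, x, n_samples=50):
    """Run n_samples stochastic forward passes with dropout enabled.

    Returns the mean prediction (the averaged estimate) and the standard
    deviation across passes (a per-voxel uncertainty / confidence map).
    """
    model.train()  # train mode keeps dropout sampling at inference time
    preds = torch.stack([model(x) for _ in range(n_samples)])
    return preds.mean(dim=0), preds.std(dim=0)

# Usage: x holds, e.g., 3-direction diffusion signals for a batch of voxels.
model = DropoutRegressor()
x = torch.randn(1024, 3)
mean_map, uncertainty_map = mc_dropout_predict(model, x)
```

Note that `model.train()` re-enables all training-time behavior; in a network containing batch normalization, one would instead switch only the dropout modules to train mode so that normalization statistics stay fixed.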



Related research

08/06/2020 · Notes on the Behavior of MC Dropout
Among the various options to estimate uncertainty in deep neural network...

10/06/2021 · A Survey on Evidential Deep Learning For Single-Pass Uncertainty Estimation
Popular approaches for quantifying predictive uncertainty in deep neural...

01/31/2020 · Fast Monte Carlo Dropout and Error Correction for Radio Transmitter Classification
Monte Carlo dropout may effectively capture model uncertainty in deep le...

07/28/2021 · Uncertainty-Aware Credit Card Fraud Detection Using Deep Learning
Countless research works of deep neural networks (DNNs) in the task of c...

07/24/2021 · μDARTS: Model Uncertainty-Aware Differentiable Architecture Search
We present a Model Uncertainty-aware Differentiable ARchiTecture Search ...
