Fail-Safe Execution of Deep Learning based Systems through Uncertainty Monitoring

02/01/2021
by Michael Weiss, et al.

Modern software systems rely on Deep Neural Networks (DNN) when processing complex, unstructured inputs, such as images, videos, natural language texts or audio signals. Given the intractably large size of such input spaces, the intrinsic limitations of learning algorithms, and the ambiguity about the expected predictions for some of the inputs, not only is there no guarantee that a DNN's predictions are always correct; rather, developers must conservatively assume a low, though not negligible, error probability. A fail-safe Deep Learning based System (DLS) is one equipped to handle DNN faults by means of a supervisor, capable of recognizing predictions that should not be trusted and that should activate a healing procedure, bringing the DLS to a safe state. In this paper, we propose an approach that uses DNN uncertainty estimators to implement such a supervisor. We first discuss the advantages and disadvantages of existing approaches to measuring uncertainty for DNNs and propose novel metrics, which rely on such approaches, for the empirical assessment of the supervisor. We then describe our publicly available tool UNCERTAINTY-WIZARD, which allows transparent estimation of uncertainty for regular tf.keras DNNs. Lastly, we discuss a large-scale study conducted on four different subjects to empirically validate the approach, reporting the lessons learned as guidance for software engineers who intend to monitor uncertainty for fail-safe execution of DLSs.
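To illustrate the idea of an uncertainty-based supervisor, the following is a minimal, framework-free sketch (not the paper's actual implementation, and not the UNCERTAINTY-WIZARD API). It assumes a stochastic DNN queried T times per input (e.g. with dropout enabled at inference), quantifies uncertainty with the variation ratio of the T predicted classes, and accepts the prediction only when uncertainty falls below a threshold; otherwise it signals that the healing procedure should run. The threshold value is a hypothetical placeholder that would be tuned empirically.

```python
from collections import Counter

def variation_ratio(samples):
    """Uncertainty of T stochastic forward passes for one input.

    samples: list of T softmax vectors (one float per class) obtained
    from dropout-enabled passes. Returns 1 - f/T, where f is how often
    the modal class was predicted; 0 means all passes agree.
    """
    votes = [max(range(len(p)), key=p.__getitem__) for p in samples]
    modal_count = Counter(votes).most_common(1)[0][1]
    return 1.0 - modal_count / len(votes)

def supervise(samples, threshold=0.2):
    """Return the predicted class if trusted, else None.

    None signals that the prediction should not be used and the DLS
    should fall back to its healing procedure / safe state.
    The 0.2 threshold is illustrative only.
    """
    if variation_ratio(samples) >= threshold:
        return None  # prediction not trusted -> bring DLS to safe state
    # average the softmax outputs across passes and pick the argmax
    mean = [sum(col) / len(samples) for col in zip(*samples)]
    return max(range(len(mean)), key=mean.__getitem__)

# Toy usage: five passes over three classes
agreeing = [[0.9, 0.05, 0.05]] * 5
print(supervise(agreeing))  # passes agree -> prediction accepted (class 0)
```

Other uncertainty quantifiers (e.g. predictive entropy or the maximum softmax probability of a deterministic network) can be substituted for the variation ratio without changing the supervisor's accept/reject structure.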

Related research

- Uncertainty Quantification for Deep Neural Networks: An Empirical Comparison and Usage Guidelines (12/14/2022)
- Towards Dependable Autonomous Systems Based on Bayesian Deep Learning Components (01/12/2023)
- A Forgotten Danger in DNN Supervision Testing: Generating and Detecting True Ambiguity (07/21/2022)
- Quantifying and Using System Uncertainty in UAV Navigation (06/04/2022)
- PAC Confidence Predictions for Deep Neural Network Classifiers (11/02/2020)
- Adopting Two Supervisors for Efficient Use of Large-Scale Remote Deep Neural Networks (04/05/2023)
- Transformer-based out-of-distribution detection for clinically safe segmentation (05/21/2022)