Second-Moment Loss: A Novel Regression Objective for Improved Uncertainties

12/23/2020
by Joachim Sicking, et al.

Quantification of uncertainty is one of the most promising approaches to establishing safe machine learning. Despite its importance, it is far from being solved in general, especially for neural networks. One of the most commonly used approaches so far is Monte Carlo dropout, which is computationally cheap and easy to apply in practice, but which can underestimate the uncertainty. We propose a new objective, referred to as second-moment loss (SML), to address this issue: while the full network is encouraged to model the mean, the dropout networks are explicitly used to optimize the model variance. We analyze the performance of the new objective on various toy and UCI regression datasets. Compared to the state of the art, deep ensembles, SML leads to comparable prediction accuracies and uncertainty estimates while requiring only a single model. Under distribution shift, we observe moderate improvements. From a safety perspective, the study of worst-case uncertainties is also crucial; in this regard we improve considerably. Finally, we show that SML can be successfully applied to SqueezeDet, a modern object detection network, improving its uncertainty-related scores without deteriorating regression quality. As a side result, we introduce an intuitive Wasserstein-distance-based uncertainty measure that is non-saturating and thus allows resolving quality differences between any two uncertainty estimates.
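To make the idea concrete, here is a minimal NumPy sketch of one plausible reading of the objective: the full network fits the mean via a standard squared error, while an added term pushes the spread of the dropout forward passes around that mean to match the magnitude of the residual. The function name `second_moment_loss`, the weighting factor `lam`, and this exact simplified form are illustrative assumptions for exposition, not the paper's precise formulation.

```python
import numpy as np

def second_moment_loss(y, mean_pred, dropout_preds, lam=1.0):
    """Hypothetical simplified second-moment-style loss (illustrative only).

    y             : (N,)  regression targets
    mean_pred     : (N,)  predictions of the full (no-dropout) network
    dropout_preds : (K, N) predictions from K Monte Carlo dropout passes
    lam           : weight of the variance-matching term (assumed hyperparameter)
    """
    # First term: the full network models the conditional mean.
    mse = np.mean((mean_pred - y) ** 2)
    # Second term: the dropout samples' deviation from the full-network mean
    # is encouraged to match the size of the actual residual, so the dropout
    # spread acts as a calibrated second-moment (variance) estimate.
    spread = np.abs(dropout_preds - mean_pred)   # (K, N) sample deviations
    residual = np.abs(y - mean_pred)             # (N,)   target deviations
    sm_term = np.mean((spread - residual) ** 2)
    return mse + lam * sm_term

# Tiny worked example with two data points and K = 2 dropout passes.
y = np.array([1.0, 2.0])
mean_pred = np.array([1.5, 2.0])
dropout_preds = np.array([[1.0, 2.5],
                          [2.0, 1.5]])
loss = second_moment_loss(y, mean_pred, dropout_preds)
```

In this toy case the mean-squared-error term is 0.125 and the spread-matching term is also 0.125, giving a total of 0.25; at test time the per-input dropout spread would then serve directly as the predictive uncertainty.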
