Quantifying Model Uncertainty for Semantic Segmentation using Operators in the RKHS

11/03/2022
by   Rishabh Singh, et al.

Deep learning models for semantic segmentation are prone to poor performance in real-world applications due to the highly challenging nature of the task. Model uncertainty quantification (UQ) is one way to address the resulting lack of trustworthiness, since it lets a practitioner gauge how much to trust a segmentation output. Current UQ methods in this domain are mainly restricted to Bayesian-based approaches, which are computationally expensive and extract only the central moments of uncertainty, limiting the quality of their estimates. We present a simple framework for high-resolution predictive uncertainty quantification of semantic segmentation models that leverages a multi-moment functional definition of uncertainty associated with the model's feature space in a reproducing kernel Hilbert space (RKHS). The uncertainty functionals extracted by this framework are defined by the local density dynamics of the model's feature space; they therefore align automatically with the tail regions of the feature space's intrinsic probability density function (where uncertainty is highest), with successively higher-order moments quantifying progressively more uncertain regions. This yields a significantly more accurate view of model uncertainty than conventional Bayesian methods. Moreover, the moments are extracted in a single-shot computation, making the framework much faster than Bayesian and ensemble approaches, which require a large number of stochastic forward passes of the model to quantify its uncertainty. We demonstrate these advantages through experimental evaluations of our framework over four state-of-the-art model architectures trained and evaluated on two benchmark road-scene segmentation datasets (CamVid and Cityscapes).
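As a rough illustration of the single-shot, multi-moment idea, the sketch below computes kernel-based uncertainty moments for one feature vector against a support set of training features. The Gaussian kernel choice, the inverse-density first moment, and all function names here are assumptions made for illustration; they are not the paper's actual operator construction in the RKHS.

```python
import numpy as np

def gaussian_kernel(x, X, sigma):
    # Kernel evaluations k(x, x_i) between a test feature x and
    # every support feature in X (rows of X are feature vectors).
    d2 = np.sum((X - x) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def uncertainty_moments(x, X, sigma=1.0, n_moments=3):
    """Hypothetical multi-moment uncertainty sketch.

    Moment 1 is an inverse-density proxy (low local kernel density in
    the tail regions of the feature distribution -> high uncertainty);
    higher moments are central moments of the kernel evaluations.
    All moments come from a single pass over the support set, i.e.
    no repeated stochastic forward passes of the model are needed.
    """
    k = gaussian_kernel(x, X, sigma)
    density = k.mean()                 # Parzen-style local density estimate
    moments = [1.0 - density]          # moment 1: inverse-density proxy
    for p in range(2, n_moments + 1):
        moments.append(np.mean((k - density) ** p))
    return np.array(moments)
```

In a segmentation setting, `x` would be a per-pixel (or per-region) feature vector from the trained network, so the moments yield a high-resolution uncertainty map without ensembling.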


