ScatterUQ: Interactive Uncertainty Visualizations for Multiclass Deep Learning Problems

08/08/2023
by Harry Li, et al.

Recently, uncertainty-aware deep learning methods for multiclass labeling problems have been developed that provide calibrated class prediction probabilities and out-of-distribution (OOD) indicators, letting machine learning (ML) consumers and engineers gauge a model's confidence in its predictions. However, this extra neural network prediction information is challenging to convey visually, at scale, for arbitrary data sources under multiple uncertainty contexts. To address these challenges, we present ScatterUQ, an interactive system that provides targeted visualizations to help users better understand model performance in context-driven uncertainty settings. ScatterUQ leverages recent advances in distance-aware neural networks, together with dimensionality reduction techniques, to construct robust 2-D scatter plots explaining why a model predicts a test example to be (1) in distribution and of a particular class, (2) in distribution but of an uncertain class, or (3) out of distribution. ML consumers and engineers can visually compare the salient features of test samples with training examples through a “hover callback” to understand model uncertainty performance and decide on follow-up courses of action. We demonstrate the effectiveness of ScatterUQ in explaining model uncertainty for multiclass image classification with a distance-aware neural network trained on Fashion-MNIST and tested on Fashion-MNIST (in distribution) and MNIST digits (out of distribution), as well as for a deep learning model on a cyber dataset. We also quantitatively evaluate dimensionality reduction techniques to optimize our contextually driven UQ visualizations. Our results indicate that the ScatterUQ system should scale to arbitrary, multiclass datasets. Our code is available at https://github.com/mit-ll-responsible-ai/equine-webapp
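
The sketch below is a minimal, hypothetical illustration of the kind of context-driven 2-D scatter plot the abstract describes, not the authors' implementation. The encoder (`ToyEncoder`), the `local_scatter` helper, the nearest-neighbor context selection, and the use of PCA are all assumptions chosen for illustration; ScatterUQ's own distance-aware models and dimensionality reduction choices may differ (see the linked repository).

```python
# Illustrative sketch (assumptions throughout): embed a test sample and nearby
# training samples with a stand-in encoder, reduce the embeddings to 2-D, and
# plot them so a user can visually compare the test point against its training
# context. This is not the ScatterUQ implementation.
import torch
import torch.nn as nn
from sklearn.decomposition import PCA
import matplotlib.pyplot as plt


class ToyEncoder(nn.Module):
    """Stand-in for a distance-aware feature extractor (assumption)."""
    def __init__(self, in_dim=784, embed_dim=32):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, 128), nn.ReLU(),
                                 nn.Linear(128, embed_dim))

    def forward(self, x):
        return self.net(x)


def local_scatter(encoder, test_x, train_x, train_y, k=50):
    """Project a test embedding plus its k nearest training embeddings to 2-D."""
    encoder.eval()
    with torch.no_grad():
        train_z = encoder(train_x)              # (N, d) training embeddings
        test_z = encoder(test_x.unsqueeze(0))   # (1, d) test embedding
    # Nearest training examples in embedding space provide the local context.
    dists = torch.cdist(test_z, train_z).squeeze(0)
    idx = torch.topk(dists, k, largest=False).indices
    # Fit the 2-D projection on this local neighborhood only, mirroring the
    # idea of a context-driven view centered on a single test example.
    pts = torch.cat([test_z, train_z[idx]]).numpy()
    xy = PCA(n_components=2).fit_transform(pts)
    plt.scatter(xy[1:, 0], xy[1:, 1], c=train_y[idx].numpy(), cmap="tab10",
                s=20, alpha=0.7, label="nearby training examples")
    plt.scatter(xy[0, 0], xy[0, 1], c="black", marker="*", s=200,
                label="test example")
    plt.legend()
    plt.title("Local 2-D view of a test example vs. training context")
    plt.show()


if __name__ == "__main__":
    # Random stand-in data; in practice these would be real embeddings of
    # training and test samples (e.g., Fashion-MNIST images).
    torch.manual_seed(0)
    enc = ToyEncoder()
    train_x = torch.randn(500, 784)
    train_y = torch.randint(0, 10, (500,))
    local_scatter(enc, torch.randn(784), train_x, train_y)
```

If the nearby training points form a tight cluster of a single class around the test star, the plot suggests a confident in-distribution prediction; a mix of classes suggests class confusion, and a test point far from all training points suggests OOD behavior, corresponding to the three uncertainty contexts described above.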

