
Getting a CLUE: A Method for Explaining Uncertainty Estimates

06/11/2020
by Javier Antorán, et al.

Both uncertainty estimation and interpretability are important factors for trustworthy machine learning systems. However, there is little work at the intersection of these two areas. We address this gap by proposing a novel method for interpreting uncertainty estimates from differentiable probabilistic models, such as Bayesian Neural Networks (BNNs). Our method, Counterfactual Latent Uncertainty Explanations (CLUE), indicates how to change an input, while keeping it on the data manifold, such that a BNN becomes more confident about the input's prediction. We validate CLUE through 1) a novel framework for evaluating counterfactual explanations of uncertainty, 2) a series of ablation experiments, and 3) a user study. Our experiments show that CLUE outperforms baselines and enables practitioners to better understand which input patterns are responsible for predictive uncertainty.
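The search the abstract describes can be written as minimising the BNN's predictive uncertainty plus a distance penalty, over the latent space of a deep generative model. The sketch below shows the shape of that optimisation; the decoder, the uncertainty function, the weight `lam`, and the finite-difference gradients are all illustrative stand-ins (the paper uses trained networks and automatic differentiation), not the authors' implementation:

```python
import numpy as np

def decoder(z):
    # Hypothetical stand-in for a DGM decoder mapping latent codes to inputs.
    return np.tanh(z)

def predictive_entropy(x):
    # Hypothetical stand-in for a BNN's predictive uncertainty: a toy binary
    # classifier whose entropy peaks where sum(x) = 0 (its decision boundary).
    p = 1.0 / (1.0 + np.exp(-4.0 * x.sum()))
    p = np.clip(p, 1e-9, 1.0 - 1e-9)
    return -(p * np.log(p) + (1.0 - p) * np.log(1.0 - p))

def clue(x0, z0, lam=0.05, lr=0.1, steps=200, eps=1e-4):
    # Minimise H(decoder(z)) + lam * ||decoder(z) - x0||_1 over z,
    # using central-difference gradients in place of autodiff.
    # lam is kept small so the uncertainty term dominates in this toy.
    def loss(z):
        x = decoder(z)
        return predictive_entropy(x) + lam * np.abs(x - x0).sum()
    z = z0.astype(float).copy()
    for _ in range(steps):
        grad = np.array([(loss(z + eps * e) - loss(z - eps * e)) / (2 * eps)
                         for e in np.eye(z.size)])
        z = z - lr * grad
    return decoder(z)

x0 = np.array([0.05, -0.02])   # ambiguous input: near-maximal entropy
z0 = np.arctanh(x0)            # initialise at the input's latent code
x_clue = clue(x0, z0)
print(predictive_entropy(x0), predictive_entropy(x_clue))
```

Optimising in latent space rather than input space is the key design choice: the counterfactual is always a decoded sample, so it stays on (an approximation of) the data manifold instead of becoming an adversarial perturbation.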


Related research

12/05/2021 - Diverse, Global and Amortised Counterfactual Explanations for Uncertainty Estimates
To interpret uncertainty estimates from differentiable probabilistic mod...

04/13/2021 - δ-CLUE: Diverse Sets of Explanations for Uncertainty Estimates
To interpret uncertainty estimates from differentiable probabilistic mod...

11/27/2019 - Actionable Interpretability through Optimizable Counterfactual Explanations for Tree Ensembles
Counterfactual explanations help users understand why machine learned mo...

09/22/2022 - Counterfactual Explanations Using Optimization With Constraint Learning
Counterfactual explanations embody one of the many interpretability tech...

03/18/2021 - Beyond Trivial Counterfactual Explanations with Diverse Valuable Explanations
Explainability for machine learning models has gained considerable atten...

01/29/2022 - Counterfactual Plans under Distributional Ambiguity
Counterfactual explanations are attracting significant attention due to ...