Explaining black box decisions by Shapley cohort refinement

11/01/2019
by Masayoshi Mase, et al.

We introduce a variable importance measure to explain the importance of individual variables to a decision made by a black box function. Our measure is based on the Shapley value from cooperative game theory. Measures of variable importance usually work by changing the value of one or more variables with the others held fixed and then recomputing the function of interest. That approach is problematic because it can create very unrealistic combinations of predictors that never appear in practice or that were never present when the prediction function was being created. Our cohort refinement Shapley approach measures variable importance without using any data points that were not actually observed.
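The key idea above is to evaluate the Shapley value function using only observed rows: conditioning on a set of variables means restricting attention to the cohort of observed points similar to the target on those variables, and averaging the black-box predictions over that cohort. The following is a minimal sketch of that idea for one target point, using exact matching as the similarity rule and brute-force enumeration of subsets; the function name, arguments, and matching rule are illustrative assumptions, not the paper's exact interface.

```python
from itertools import combinations
from math import factorial

def cohort_shapley(X, y, t, similar):
    """Illustrative cohort Shapley for one target point t (assumed interface).

    X       : list of observed predictor rows (no synthetic points are made)
    y       : list of black-box predictions for those rows
    t       : index of the target point being explained
    similar : similar(i, j) -> True if row i matches row t on variable j
              (exact matching here; a refined similarity rule could be used)
    """
    n, d = len(X), len(X[0])

    def value(S):
        # Cohort mean over observed points similar to t on every j in S.
        # The cohort always contains t itself, so it is never empty.
        cohort = [y[i] for i in range(n) if all(similar(i, j) for j in S)]
        return sum(cohort) / len(cohort)

    # Exact Shapley value by enumerating all subsets not containing j;
    # feasible only for small d, shown here for clarity.
    phi = [0.0] * d
    for j in range(d):
        others = [k for k in range(d) if k != j]
        for r in range(d):
            for S in combinations(others, r):
                w = factorial(r) * factorial(d - 1 - r) / factorial(d)
                phi[j] += w * (value(S + (j,)) - value(S))
    return phi  # by Shapley efficiency, sum(phi) == y[t] - mean(y)
```

On a toy data set with unique rows and exact matching, the empty-set value is the overall mean prediction and the full-set cohort is the target alone, so the attributions sum to the target's prediction minus the mean, and every value-function evaluation uses only points that were actually observed.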

Related research:

- A Generalized Variable Importance Metric and Estimator for Black Box Machine Learning Models (12/20/2022)
- Variable importance without impossible data (05/31/2022)
- The Importance of Variable Importance (12/06/2022)
- Anomaly Attribution with Likelihood Compensation (08/23/2022)
- A Rate-Distortion Framework for Explaining Black-box Model Decisions (10/12/2021)
- TimeSHAP: Explaining Recurrent Models through Sequence Perturbations (11/30/2020)
- Cohort Shapley value for algorithmic fairness (05/15/2021)
