Randomized Ablation Feature Importance

10/01/2019
by   Luke Merrick, et al.
Fiddler Labs Inc.

Given a model f that predicts a target y from a vector of input features x = (x_1, x_2, ..., x_M), we seek to measure the importance of each feature to the model's ability to make a good prediction. To this end, we consider how some measure of the goodness or badness of a prediction (which we term the "loss" ℓ) changes, on average, when we hide, or ablate, each feature from the model. To ablate a feature, we randomly replace its value with another possible value. By averaging over many data points and many possible replacement values, we measure the importance of a feature to the model's ability to make good predictions. Furthermore, we present statistical measures of uncertainty that quantify how confident we are that the feature importance estimated from our finite dataset and finite number of ablations is close to the theoretical true importance value.
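The procedure described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: sampling replacement values by permuting each feature column (i.e. drawing from its empirical marginal distribution) and reporting the standard error of the mean across ablation rounds as the uncertainty measure are assumptions consistent with, but not specified by, the abstract.

```python
import numpy as np

def ablation_importance(model, X, y, loss_fn, n_ablations=10, seed=0):
    """Estimate randomized-ablation feature importance.

    For each feature, replace its values with random draws from that
    feature's own column (its empirical marginal), and record how much
    the average loss increases. The mean increase over `n_ablations`
    rounds is the importance estimate; the standard error of that mean
    is a simple uncertainty measure.
    """
    rng = np.random.default_rng(seed)
    n, m = X.shape
    base_loss = loss_fn(y, model(X))  # loss with no feature ablated
    importances = np.zeros(m)
    std_errs = np.zeros(m)
    for j in range(m):
        deltas = []
        for _ in range(n_ablations):
            X_abl = X.copy()
            # Ablate feature j by shuffling its values across rows,
            # i.e. replacing each value with another possible value.
            X_abl[:, j] = rng.permutation(X[:, j])
            deltas.append(loss_fn(y, model(X_abl)) - base_loss)
        deltas = np.asarray(deltas)
        importances[j] = deltas.mean()
        std_errs[j] = deltas.std(ddof=1) / np.sqrt(n_ablations)
    return importances, std_errs
```

On a toy model that uses only its first feature, ablating that feature raises the loss while ablating an ignored feature leaves it unchanged, so the first importance dominates.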
