Shapley Decomposition of R-Squared in Machine Learning Models

08/26/2019
by Nickalus Redell, et al.

In this paper we introduce a metric aimed at helping machine learning practitioners quickly summarize and communicate the overall importance of each feature in any black-box machine learning prediction model. Our proposed metric, based on a Shapley-value variance decomposition of the familiar R^2 from classical statistics, is a model-agnostic approach for assessing feature importance that fairly allocates the proportion of model-explained variability in the data to each model feature. This metric has several desirable properties, including boundedness between 0 and 1 and a feature-level variance decomposition that sums to the overall model R^2. In contrast to related methods for computing feature-level R^2 variance decompositions with linear models, our method makes use of pre-computed Shapley values, which effectively shifts the computational burden from iteratively fitting many models to the Shapley values themselves. With recent advancements in Shapley value calculations for gradient boosted decision trees and neural networks, computing our proposed metric after model training comes with minimal computational overhead. Our implementation is available in the R package shapFlex.
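As a rough illustration of how such a decomposition can be assembled from pre-computed Shapley values, the base-R sketch below allocates the overall model R^2 across features in proportion to the increase in residual error when each feature's Shapley contribution is removed from the prediction. The function name `shapley_r2` and this particular allocation rule are illustrative assumptions, not the paper's exact formula; the authors' method is implemented in shapFlex.

```r
# Minimal sketch of a Shapley-based R^2 decomposition (illustrative
# allocation rule; see the paper and shapFlex for the exact method).
shapley_r2 <- function(y, shap, baseline) {
  # y:        numeric vector of observed outcomes (length n)
  # shap:     n x p matrix of pre-computed Shapley values, one column per feature
  # baseline: scalar baseline prediction (e.g., the mean model output)

  y_hat <- baseline + rowSums(shap)   # Shapley additivity reconstructs predictions
  sst   <- sum((y - mean(y))^2)       # total sum of squares
  sse   <- sum((y - y_hat)^2)         # residual sum of squares
  r2    <- 1 - sse / sst              # overall model R^2

  # Increase in residual error when feature j's Shapley contribution is
  # removed from the prediction (hypothetical allocation rule).
  v <- sapply(seq_len(ncol(shap)), function(j) {
    sum((y - (y_hat - shap[, j]))^2) - sse
  })

  # Allocate R^2 proportionally so the feature-level values sum to the model R^2.
  r2_features <- r2 * v / sum(v)
  list(r2 = r2, r2_features = setNames(r2_features, colnames(shap)))
}
```

Note that only the pre-computed Shapley matrix is needed here; no additional models are refit, which reflects the computational shift described in the abstract. In practice the Shapley values would come from a fast exact or approximate method (e.g., TreeSHAP for gradient boosted trees), and the paper's formula guarantees the boundedness and additivity properties stated above.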
