Learning to Explain: An Information-Theoretic Perspective on Model Interpretation

02/21/2018
by Jianbo Chen, et al.

We introduce instancewise feature selection as a methodology for model interpretation. Our method is based on learning a function to extract a subset of features that are most informative for each given example. This feature selector is trained to maximize the mutual information between selected features and the response variable, where the conditional distribution of the response variable given the input is the model to be explained. We develop an efficient variational approximation to the mutual information, and show that the resulting method compares favorably to other model explanation methods on a variety of synthetic and real data sets using both quantitative metrics and human evaluation.
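The variational objective described in the abstract lends itself to a compact implementation. The sketch below is not the authors' released code; it is a minimal, hedged illustration in PyTorch, assuming a selector network that scores features per example, a Gumbel-Softmax (concrete) relaxation to draw a roughly k-hot feature mask differentiably, and an approximator network q trained to match the black-box model's output distribution on the masked input. All architectures, dimensions, and the toy black-box model are illustrative assumptions.

```python
# Minimal sketch of instancewise feature selection via a variational lower
# bound on mutual information. Illustrative only; not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F

d, k, n_classes = 10, 3, 2  # feature dimension, features to select, classes (assumed)

# Selector: per-example logits over the d feature positions.
selector = nn.Sequential(nn.Linear(d, 64), nn.ReLU(), nn.Linear(64, d))
# Approximator q(Y | masked input), used for the variational bound.
approximator = nn.Sequential(nn.Linear(d, 64), nn.ReLU(), nn.Linear(64, n_classes))

def sample_k_hot(logits, k, tau=0.5):
    """Relaxed k-hot mask: k independent Gumbel-Softmax samples over the
    feature positions, combined with an elementwise maximum."""
    samples = [F.gumbel_softmax(logits, tau=tau, hard=False) for _ in range(k)]
    return torch.stack(samples, dim=0).max(dim=0).values  # (batch, d)

def training_step(x, black_box, optimizer):
    with torch.no_grad():
        target = black_box(x)                # P_model(Y | x): the model to be explained
    mask = sample_k_hot(selector(x), k)      # soft k-hot mask for each example
    log_q = F.log_softmax(approximator(x * mask), dim=-1)
    # Maximizing E[ sum_y P_model(y|x) * log q(y | x_S) ] lower-bounds the
    # mutual information I(X_S; Y) up to a constant, so minimize its negative.
    loss = -(target * log_q).sum(dim=-1).mean()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Toy "model to explain" (a stand-in for any black-box classifier).
black_box = lambda x: F.softmax(x[:, :n_classes], dim=-1)
opt = torch.optim.Adam([*selector.parameters(), *approximator.parameters()], lr=1e-3)
for _ in range(200):
    training_step(torch.randn(32, d), black_box, opt)

# At explanation time, rank features for a new example by the selector's scores.
scores = selector(torch.randn(1, d))
print(scores.topk(k, dim=-1).indices)  # indices of the k selected features
```

In this sketch, a single forward pass through the trained selector yields the explanation for a given example, which is what makes the approach efficient relative to per-example optimization methods.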

Related research

- Estimating Conditional Mutual Information for Dynamic Feature Selection (06/05/2023): Dynamic feature selection, where we sequentially query features to make ...
- Adversarial Infidelity Learning for Model Interpretation (06/09/2020): Model interpretation is essential in data mining and knowledge discovery...
- Information-theoretic Feature Selection via Tensor Decomposition and Submodularity (10/30/2020): Feature selection by maximizing high-order mutual information between th...
- Instance-wise Causal Feature Selection for Model Interpretation (04/26/2021): We formulate a causal extension to the recently introduced paradigm of i...
- Kernel Feature Selection via Conditional Covariance Minimization (07/04/2017): We propose a framework for feature selection that employs kernel-based m...
- Simple stopping criteria for information theoretic feature selection (11/29/2018): Information theoretic feature selection aims to select a smallest featur...
- Deep supervised feature selection using Stochastic Gates (10/09/2018): In this study, we propose a novel non-parametric embedded feature select...
