A Rate-Distortion Framework for Explaining Neural Network Decisions

05/27/2019
by Jan Macdonald, et al.

We formalise the widespread idea of interpreting neural network decisions as an explicit optimisation problem in a rate-distortion framework. A set of input features is deemed relevant for a classification decision if the expected classifier score remains nearly constant when the remaining features are randomised. We discuss the computational complexity of finding small sets of relevant features and show that the problem is complete for NP^PP, an important class of computational problems frequently arising in AI tasks. Furthermore, we show that it remains NP-hard even to approximate the optimal solution to within any non-trivial approximation factor. Finally, we consider a continuous relaxation of the problem and develop a heuristic solution strategy based on assumed density filtering for deep ReLU neural networks. We present numerical experiments for two image classification data sets in which we outperform established methods, in particular for sparse explanations of neural network decisions.
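The relevance criterion in the abstract can be sketched as a Monte Carlo estimate: fix the features in a candidate relevant set, randomise the complement, and measure how much the classifier score deviates from its value on the unperturbed input. The snippet below is a minimal illustration of that idea, not the paper's method; the choice of a standard Gaussian perturbation distribution and the function names are assumptions of this sketch.

```python
import numpy as np

def distortion(f, x, mask, n_samples=1000, rng=None):
    """Monte Carlo estimate of the expected squared deviation of the
    classifier score when the features outside `mask` are randomised.

    f         -- maps a batch of inputs (n, d) to scalar scores (n,)
    x         -- a single input of shape (d,)
    mask      -- boolean array of shape (d,); True marks features that
                 are kept fixed (the candidate relevant set)
    """
    rng = np.random.default_rng(rng)
    base = f(x[None])[0]  # score on the unperturbed input
    # Copy the input n_samples times and overwrite the complement of
    # the relevant set with Gaussian noise (assumed perturbation).
    xs = np.repeat(x[None].astype(float), n_samples, axis=0)
    noise = rng.standard_normal(xs.shape)
    xs[:, ~mask] = noise[:, ~mask]
    # Low distortion means the masked-out features barely matter,
    # i.e. the kept set already explains the decision.
    return float(np.mean((f(xs) - base) ** 2))
```

For a classifier that depends only on the first feature, keeping that feature fixed yields near-zero distortion, while randomising it yields a large one; the rate-distortion trade-off then asks for the smallest mask achieving a given distortion level.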
