Layer-wise Relevance Propagation for Neural Networks with Local Renormalization Layers

04/04/2016
by   Alexander Binder, et al.

Layer-wise relevance propagation (LRP) is a framework that decomposes the prediction a deep neural network computes for a sample, e.g. an image, into relevance scores for the sample's individual input dimensions, such as the subpixels of an image. While this approach applies directly to generalized linear mappings, it does not cover product-type non-linearities. This paper proposes an approach that extends layer-wise relevance propagation to neural networks with local renormalization layers, a very common product-type non-linearity in convolutional neural networks. We evaluate the proposed method for local renormalization layers on the CIFAR-10, ImageNet and MIT Places datasets.
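To make the decomposition concrete, below is a minimal sketch of the basic LRP redistribution rule (the epsilon-stabilized rule) for a single linear layer. This illustrates the general framework the abstract describes, not the paper's specific extension to local renormalization layers; the function name and shapes are our own assumptions for illustration.

```python
import numpy as np

def lrp_linear(x, W, b, relevance_out, eps=1e-6):
    """Epsilon-rule LRP for a linear layer y = W @ x + b.

    Each output neuron's relevance is redistributed over the inputs
    in proportion to each input's contribution z_ij = W_ij * x_j.
    (Hypothetical helper for illustration; not from the paper.)
    """
    z = W * x[np.newaxis, :]                 # contributions z_ij, shape (out, in)
    zs = z.sum(axis=1) + b                   # pre-activations, shape (out,)
    zs = zs + eps * np.where(zs >= 0, 1.0, -1.0)  # stabilize against division by ~0
    # distribute each output's relevance proportionally, then sum over outputs
    return (z / zs[:, np.newaxis] * relevance_out[:, np.newaxis]).sum(axis=0)
```

With zero bias and a small `eps`, the rule approximately conserves relevance: the input relevances sum to the output relevances, which is the property that lets LRP propagate a prediction score all the way down to the input pixels.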

Related research

- Breaking Batch Normalization for better explainability of Deep Neural Networks through Layer-wise Relevance Propagation (02/24/2020)
- Layer-wise Relevance Propagation for Explainable Recommendations (07/17/2018)
- Controlling Explanatory Heatmap Resolution and Semantics via Decomposition Depth (03/21/2016)
- Why Layer-Wise Learning is Hard to Scale-up and a Possible Solution via Accelerated Downsampling (10/15/2020)
- Towards best practice in explaining neural network decisions with LRP (10/22/2019)
- Fact or Artifact? Revise Layer-wise Relevance Propagation on various ANN Architectures (02/23/2023)
- Interpretation of Deep Temporal Representations by Selective Visualization of Internally Activated Units (04/27/2020)
