Pre- or Post-Softmax Scores in Gradient-Based Attribution Methods: What Is Best?

06/22/2023
by Miguel Lerma, et al.

Gradient-based attribution methods for neural networks working as classifiers use gradients of network scores. Here we discuss the practical differences between using gradients of pre-softmax scores versus post-softmax scores, and their respective advantages and disadvantages.
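The distinction the abstract draws can be made concrete with a toy example (a hypothetical sketch, not taken from the paper): for a linear "network" with logits z = Wx, the gradient of a pre-softmax score with respect to the input is simply the corresponding row of W, while the gradient of the post-softmax probability mixes in the gradients of all competing classes via the chain rule.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax."""
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical toy setup: a linear classifier z = W x,
# so the pre-softmax score gradients are just the rows of W.
rng = np.random.default_rng(0)
W = rng.normal(size=(3, 5))   # 3 classes, 5 input features
x = rng.normal(size=5)

z = W @ x                     # pre-softmax scores (logits)
p = softmax(z)                # post-softmax scores (probabilities)
c = int(np.argmax(p))         # class being explained

# Gradient of the pre-softmax score z_c w.r.t. the input:
grad_pre = W[c]

# Gradient of the post-softmax score p_c w.r.t. the input,
# via the chain rule: dp_c/dx = p_c * (dz_c/dx - sum_k p_k dz_k/dx)
grad_post = p[c] * (W[c] - p @ W)
```

The two gradients generally differ in both scale and direction: `grad_post` is damped by the factor p_c(1 - p_c)-like terms (it vanishes as the network becomes confident), and it subtracts a probability-weighted average of the other classes' score gradients, which is the practical trade-off the paper examines.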

Related research

07/06/2023 · A Vulnerability of Attribution Methods Using Pre-Softmax Scores
We discuss a vulnerability involving a category of attribution methods u...

11/16/2017 · A unified view of gradient-based attribution methods for Deep Neural Networks
Understanding the flow of information in Deep Neural Networks is a chall...

05/19/2022 · Towards a Theory of Faithfulness: Faithful Explanations of Differentiable Classifiers over Continuous Data
There is broad agreement in the literature that explanation methods shou...

07/15/2022 · Anomalous behaviour in loss-gradient based interpretability methods
Loss-gradients are used to interpret the decision making process of deep...

09/26/2014 · Gradient-based Taxis Algorithms for Network Robotics
Finding the physical location of a specific network node is a prototypic...

07/18/2023 · Gradient strikes back: How filtering out high frequencies improves explanations
Recent years have witnessed an explosion in the development of novel pre...

08/16/2021 · Escaping the Gradient Vanishing: Periodic Alternatives of Softmax in Attention Mechanism
Softmax is widely used in neural networks for multiclass classification,...
