Noise-adding Methods of Saliency Map as Series of Higher Order Partial Derivative

06/08/2018
by Junghoon Seo, et al.

SmoothGrad and VarGrad are techniques that enhance the empirical quality of standard saliency maps by adding noise to the input. However, few works have provided a rigorous theoretical interpretation of these methods. We analytically formalize the results of these noise-adding methods, and from this formalization two interesting observations follow. First, SmoothGrad does not make the gradient of the score function smooth. Second, VarGrad is independent of the gradient of the score function. We believe that our findings provide a clue to revealing the relationship between local explanation methods of deep neural networks and higher-order partial derivatives of the score function.
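As background, both methods are Monte Carlo estimates over Gaussian-perturbed inputs: SmoothGrad averages the gradients of the score function, while VarGrad takes their elementwise variance. The sketch below is illustrative only (the toy quadratic score function, the noise scale `sigma`, and the sample count are our own choices, not the paper's setup), but it shows the shared sampling structure the analysis starts from:

```python
import numpy as np

def smoothgrad_and_vargrad(grad_fn, x, sigma=0.05, n_samples=50, seed=0):
    """Monte Carlo estimates of SmoothGrad (mean of gradients under
    Gaussian input noise) and VarGrad (elementwise variance of those
    same gradients). `grad_fn` returns the gradient of the score
    function at a given input."""
    rng = np.random.default_rng(seed)
    grads = np.stack([
        grad_fn(x + rng.normal(0.0, sigma, size=x.shape))
        for _ in range(n_samples)
    ])
    return grads.mean(axis=0), grads.var(axis=0)

# Toy score function S(x) = sum(x_i^2), so its gradient is 2x.
grad_fn = lambda x: 2.0 * x
x = np.array([1.0, -2.0, 3.0])
sg, vg = smoothgrad_and_vargrad(grad_fn, x)
```

For this quadratic score the gradient is linear in the input, so SmoothGrad recovers (approximately) the plain gradient `2x`, while VarGrad is roughly the constant `4 * sigma**2` in every coordinate, carrying no information about the gradient at `x` — a small concrete instance of the two observations above.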
