Revisiting the Fragility of Influence Functions

03/22/2023
by Jacob R. Epifano, et al.

In the last few years, many works have tried to explain the predictions of deep learning models. Few methods, however, have been proposed to verify the accuracy or faithfulness of these explanations. Recently, influence functions, a method that approximates the effect of leave-one-out retraining on the loss function, have been shown to be fragile. The reasons for this fragility remain unclear. Although previous work suggests using regularization to increase robustness, this does not hold in all cases. In this work, we revisit the experiments performed in prior work to understand the underlying mechanisms of influence-function fragility. First, we verify influence functions using procedures from the literature under conditions where their convexity assumptions are met. Then, we relax these assumptions and study the effects of non-convexity by using deeper models and more complex datasets. Throughout, we analyze the key metrics and procedures used to validate influence functions. Our results indicate that the validation procedures themselves may cause the observed fragility.
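For readers unfamiliar with the quantity being validated, the following is a minimal sketch (not the authors' code) of how an influence-function estimate is typically checked against actual leave-one-out retraining in the convex setting the paper starts from. The dataset, L2-regularized logistic regression model, regularization strength, and choice of test point are illustrative assumptions.

```python
# Minimal sketch (not the authors' code): estimate the effect of removing a
# single training point with an influence function and compare it against
# actual leave-one-out retraining, for L2-regularized logistic regression,
# where the convexity assumptions behind influence functions hold.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, n_features=5, random_state=0)
X_tr, y_tr, x_te, y_te = X[:150], y[:150], X[150], y[150]
lam = 1.0  # strength of the (lam/2)*||w||^2 penalty; C = 1/lam in sklearn

def fit(X, y):
    return LogisticRegression(C=1.0 / lam, fit_intercept=False,
                              max_iter=1000).fit(X, y)

def loss(w, x, t):
    p = 1.0 / (1.0 + np.exp(-x @ w))
    return -(t * np.log(p) + (1 - t) * np.log(1 - p))

def grad(w, x, t):
    # Gradient of the per-example logistic loss with respect to the weights.
    p = 1.0 / (1.0 + np.exp(-x @ w))
    return (p - t) * x

def hessian(w, X, y):
    # Hessian of the full training objective: sum of losses plus L2 term.
    p = 1.0 / (1.0 + np.exp(-X @ w))
    return (X * (p * (1 - p))[:, None]).T @ X + lam * np.eye(X.shape[1])

model = fit(X_tr, y_tr)
w = model.coef_.ravel()
H_inv = np.linalg.inv(hessian(w, X_tr, y_tr))

for i in (0, 1, 2):  # a few training points to check
    # Influence-function estimate of the change in test loss when point i is
    # removed: g_test^T H^{-1} g_i (Koh & Liang, 2017, with the sum Hessian).
    est = grad(w, x_te, y_te) @ H_inv @ grad(w, X_tr[i], y_tr[i])
    # Ground truth: retrain without point i and measure the test-loss change.
    keep = np.arange(len(y_tr)) != i
    w_loo = fit(X_tr[keep], y_tr[keep]).coef_.ravel()
    actual = loss(w_loo, x_te, y_te) - loss(w, x_te, y_te)
    print(f"train point {i}: IF estimate {est:+.5f}, actual LOO {actual:+.5f}")
```

In this regularized convex setting the Hessian is positive definite, so the inverse-Hessian-vector product is well defined and the estimate tracks actual retraining; the paper's later experiments relax exactly this assumption with deeper models and more complex datasets.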

