What do Deep Networks Like to Read?

09/10/2019
by Jonas Pfeiffer, et al.

Recent research on understanding neural networks probes models in a top-down manner, but can only identify model tendencies that are known a priori. We propose Susceptibility Identification through Fine-Tuning (SIFT), a novel abstractive method that uncovers a model's preferences without imposing any prior. By fine-tuning an autoencoder with the gradients from a fixed classifier, we are able to extract propensities that characterize different kinds of classifiers in a bottom-up manner. We further leverage the SIFT architecture to rephrase sentences so that the fixed classifier predicts the class opposite to the ground-truth label, uncovering potential artifacts encoded in the fixed classification model. We evaluate our method on three diverse tasks with four different models. We contrast the propensities of the models and reproduce artifacts reported in the literature.
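The core mechanism described in the abstract, fine-tuning an autoencoder with gradients that flow through a fixed classifier, can be illustrated with a minimal PyTorch sketch. The module sizes, loss, and stand-in data below are illustrative assumptions rather than the authors' implementation; only the gradient flow (frozen classifier, trainable autoencoder) reflects the description above.

```python
# Minimal sketch of the SIFT idea: an autoencoder is fine-tuned using gradients
# that pass through a *frozen* classifier, so its reconstructions drift toward
# inputs the classifier prefers. All shapes and hyperparameters are assumptions.
import torch
import torch.nn as nn

class Autoencoder(nn.Module):
    def __init__(self, dim=300, hidden=128):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU())
        self.decoder = nn.Linear(hidden, dim)

    def forward(self, x):
        return self.decoder(self.encoder(x))

# Hypothetical fixed classifier over the same representation space.
classifier = nn.Linear(300, 2)
for p in classifier.parameters():
    p.requires_grad = False  # the classifier stays fixed; only the AE is updated

autoencoder = Autoencoder()
optimizer = torch.optim.Adam(autoencoder.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

x = torch.randn(32, 300)        # stand-in for sentence representations
y = torch.randint(0, 2, (32,))  # labels (flip them to probe for artifacts)

for _ in range(100):
    optimizer.zero_grad()
    recon = autoencoder(x)       # "rephrased" representation
    logits = classifier(recon)   # frozen classifier scores the reconstruction
    loss = criterion(logits, y)  # gradients flow back into the autoencoder only
    loss.backward()
    optimizer.step()
```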


