Deephys: Deep Electrophysiology, Debugging Neural Networks under Distribution Shifts

03/17/2023
by   Anirban Sarkar, et al.

Deep Neural Networks (DNNs) often fail in out-of-distribution scenarios. In this paper, we introduce a tool to visualize and understand such failures. We draw inspiration from concepts in neural electrophysiology, which are based on inspecting the internal functioning of a neural network by analyzing the feature tuning and invariances of individual units. Deep Electrophysiology, Deephys for short, provides insights into a DNN's failures in out-of-distribution scenarios through comparative visualization of neural activity on in-distribution and out-of-distribution datasets. Deephys supports seamless analyses of individual neurons, individual images, and sets of images from a category, and it is capable of revealing failures due to the presence of spurious features and novel features. We substantiate the validity of Deephys's qualitative visualizations through quantitative analyses using convolutional and transformer architectures, across several datasets and distribution shifts (namely, colored MNIST, CIFAR-10, and ImageNet).
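The core idea of comparing per-neuron activity between in-distribution and out-of-distribution data can be illustrated with a minimal sketch. The function below is a hypothetical simplification, not the actual Deephys implementation: it flags neurons whose mean activation on OOD inputs deviates from the ID mean by more than one ID standard deviation.

```python
import numpy as np

def neuron_activity_shift(acts_id, acts_ood, threshold=1.0):
    """Compare per-neuron mean activations between in-distribution (ID)
    and out-of-distribution (OOD) batches.

    acts_id, acts_ood: arrays of shape (num_samples, num_neurons),
    e.g. activations recorded at one layer of a network.
    Returns the indices of neurons whose mean activity shifts by more
    than `threshold` ID standard deviations, plus the shift scores.
    """
    mu_id = acts_id.mean(axis=0)
    sd_id = acts_id.std(axis=0) + 1e-8   # avoid division by zero
    mu_ood = acts_ood.mean(axis=0)
    shift = np.abs(mu_ood - mu_id) / sd_id
    return np.nonzero(shift > threshold)[0], shift

# Synthetic example: 5 "neurons", with neuron 2 responding to a
# spurious feature that changes under distribution shift.
rng = np.random.default_rng(0)
acts_id = rng.normal(0.0, 1.0, size=(200, 5))
acts_ood = rng.normal(0.0, 1.0, size=(200, 5))
acts_ood[:, 2] += 3.0                    # simulated tuning shift
flagged, shift = neuron_activity_shift(acts_id, acts_ood)
print(flagged)                           # neuron 2 is flagged
```

A real workflow would record `acts_id` and `acts_ood` with forward hooks on a trained model; the flagged neurons are the ones worth inspecting visually, e.g. via their most-activating images in each dataset.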


Related research

- Visualizing Deep Neural Networks with Topographic Activation Maps (04/07/2022): Machine Learning with Deep Neural Networks (DNNs) has become a successfu...
- Diverse feature visualizations reveal invariances in early layers of deep neural networks (07/27/2018): Visualizing features in deep neural networks (DNNs) can help understandi...
- Understanding Silent Failures in Medical Image Classification (07/27/2023): To ensure the reliable use of classification systems in medical applicat...
- A synthetic dataset for deep learning (06/01/2019): In this paper, we propose a novel method for generating a synthetic data...
- Strike (with) a Pose: Neural Networks Are Easily Fooled by Strange Poses of Familiar Objects (11/28/2018): Despite excellent performance on stationary test sets, deep neural netwo...
- Influence-Directed Explanations for Deep Convolutional Networks (02/11/2018): We study the problem of explaining a rich class of behavioral properties...
