Inference Graphs for CNN Interpretation

10/20/2021
by Yael Konforti et al.

Convolutional neural networks (CNNs) have achieved superior accuracy in many vision-related tasks. However, the inference process through the intermediate layers is opaque, making it difficult to interpret such networks or to develop trust in their operation. We propose to model the activity of the network's hidden layers using probabilistic models. The activity patterns in the layers of interest are modeled as Gaussian mixture models, and transition probabilities between clusters in consecutive modeled layers are estimated. Based on maximum-likelihood considerations, nodes and paths relevant to the network's prediction are chosen, connected, and visualized as an inference graph. We show that such graphs are useful for understanding the general inference process of a class, as well as for explaining decisions the network makes about specific images.
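The sketch below illustrates the kind of pipeline the abstract describes: fit a Gaussian mixture model to each modeled layer's activations, estimate transition probabilities between clusters of consecutive layers from cluster co-assignments, and keep only the likely edges as the inference graph. It is a minimal illustration, not the authors' exact procedure; the per-channel global average pooling, the number of clusters, the edge threshold, and all function names are assumptions.

```python
# Hypothetical sketch of the approach described in the abstract. All names,
# the pooling step, and the thresholds are assumptions for illustration only.
import numpy as np
from sklearn.mixture import GaussianMixture


def pool_activations(feature_maps):
    """Reduce (N, C, H, W) feature maps to (N, C) by global average pooling (assumed)."""
    return feature_maps.mean(axis=(2, 3))


def fit_layer_gmm(activations, n_clusters=5, seed=0):
    """Model a layer's pooled activations with a Gaussian mixture model."""
    gmm = GaussianMixture(n_components=n_clusters, covariance_type="diag", random_state=seed)
    gmm.fit(activations)
    return gmm


def transition_matrix(labels_a, labels_b, k_a, k_b):
    """Estimate P(cluster in next layer | cluster in current layer) from co-assignments."""
    counts = np.zeros((k_a, k_b))
    for a, b in zip(labels_a, labels_b):
        counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    return counts / np.maximum(row_sums, 1)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Stand-in activations for two consecutive layers; replace with real CNN features.
    layer1 = rng.normal(size=(500, 64, 8, 8))
    layer2 = rng.normal(size=(500, 128, 4, 4))

    x1, x2 = pool_activations(layer1), pool_activations(layer2)
    gmm1, gmm2 = fit_layer_gmm(x1), fit_layer_gmm(x2)

    labels1, labels2 = gmm1.predict(x1), gmm2.predict(x2)
    T = transition_matrix(labels1, labels2, 5, 5)

    # Keep only likely transitions; surviving clusters and edges form the inference graph.
    edges = [(i, j, T[i, j]) for i in range(5) for j in range(5) if T[i, j] > 0.2]
    print(edges)
```

In practice the clusters would be fit on activations collected from a trained network over a dataset, and the retained nodes and edges would be visualized per class or per image.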


