
Visual Interpretability for Deep Learning: a Survey

02/02/2018
by Quanshi Zhang, et al.

This paper reviews recent studies in emerging directions of understanding neural-network representations and of learning neural networks with interpretable/disentangled middle-layer representations. Although deep neural networks have exhibited superior performance in various tasks, interpretability has always been the Achilles' heel of deep neural networks. At present, deep neural networks obtain high discrimination power at the cost of low interpretability of their black-box representations. We believe that high model interpretability may help people break several bottlenecks of deep learning, e.g., learning from very few annotations, learning via human-computer communication at the semantic level, and semantically debugging network representations. In this paper, we focus on convolutional neural networks (CNNs) and revisit the visualization of CNN representations, methods for diagnosing representations of pre-trained CNNs, approaches for disentangling pre-trained CNN representations, the learning of CNNs with disentangled representations, and middle-to-end learning based on model interpretability. Finally, we discuss prospective trends of explainable artificial intelligence.
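As a concrete illustration of the first theme, visualization of CNN representations, one widely used technique is a gradient-based saliency map, which highlights the input pixels that most affect a network's prediction. The sketch below is not taken from the paper; it is a minimal example assuming a recent PyTorch/torchvision installation, and the choice of model (ResNet-18), weights API, and input file "example.jpg" are illustrative assumptions.

    # Minimal sketch: vanilla gradient saliency map for a pretrained CNN.
    # Assumes torchvision >= 0.13 (weights API); model and image path are illustrative.
    import torch
    import torchvision.models as models
    import torchvision.transforms as T
    from PIL import Image

    model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    model.eval()

    preprocess = T.Compose([
        T.Resize(256),
        T.CenterCrop(224),
        T.ToTensor(),
        T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
    ])

    img = Image.open("example.jpg").convert("RGB")   # hypothetical input image
    x = preprocess(img).unsqueeze(0).requires_grad_(True)

    logits = model(x)
    pred = logits.argmax()            # predicted class index (batch of one)
    logits[0, pred].backward()        # gradient of the top logit w.r.t. input pixels

    # Saliency: maximum absolute gradient over the color channels, one value per pixel.
    saliency = x.grad.abs().max(dim=1)[0].squeeze(0)
    print(saliency.shape)             # torch.Size([224, 224])

The resulting per-pixel map can be rendered as a heatmap over the input image; more elaborate visualization methods surveyed in the paper (e.g., activation maximization or up-convolutional inversion) build on the same idea of relating internal representations back to image space.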

01/21/2019
Unsupervised Learning of Neural Networks to Explain Neural Networks (extended abstract)
This paper presents an unsupervised method to learn a neural network, na...

04/16/2022
Semantic interpretation for convolutional neural networks: What makes a cat a cat?
The interpretability of deep neural networks has attracted increasing at...

12/28/2020
A Survey on Neural Network Interpretability
Along with the great success of deep neural networks, there is also grow...

04/30/2021
Interpretable Semantic Photo Geolocalization
Planet-scale photo geolocalization is the complex task of estimating the...

05/06/2019
Deep Visual City Recognition Visualization
Understanding how cities visually differ from each other is interesting...

11/19/2018
Deeper Interpretability of Deep Networks
Deep Convolutional Neural Networks (CNNs) have been one of the most infl...