A backward pass through a CNN using a generative model of its activations

11/08/2016
by Huayan Wang, et al.

Neural networks have been shown to be a practical way of building very complex mappings between a pre-specified input space and output space. For example, a convolutional neural network (CNN) mapping an image to one of a thousand object labels approaches human performance on this particular task. However, the mapping (neural network) does not automatically lend itself to other forms of queries, for example, detecting or reconstructing object instances, enforcing a top-down signal on ambiguous inputs, or recovering object instances from occlusion. One way to address these queries is a backward pass through the network that fuses top-down and bottom-up information. In this paper, we show a way of building such a backward pass by defining a generative model of the neural network's activations. Approximate inference in this model naturally takes the form of a backward pass through the CNN layers, addressing the aforementioned queries in a unified framework.
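The abstract describes the backward pass only at a high level, so the following is a minimal NumPy sketch of the general idea rather than the paper's actual model: each layer's activation is inferred as a compromise between bottom-up evidence (its feed-forward value) and top-down generation (producing the layer above under the feed-forward weights). All names here (backward_infer, the weighting lam, the dense toy layers) are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

# Toy two-layer feed-forward stack (dense instead of convolutional,
# purely to keep the inference loop readable).
W = [rng.standard_normal((16, 8)) * 0.3,
     rng.standard_normal((8, 4)) * 0.3]

def forward(x):
    """Bottom-up pass: record the activation at every layer."""
    acts = [x]
    for Wl in W:
        acts.append(relu(acts[-1] @ Wl))
    return acts

def backward_infer(acts, top_down, lam=0.5, steps=100, lr=0.1):
    """Hypothetical backward pass: starting from a top-down target on
    the last layer, infer each lower layer's activation by gradient
    descent on
        0.5 * ||relu(h @ W[l]) - h_above||^2   (generate the layer above)
      + 0.5 * lam * ||h - acts[l]||^2          (stay near bottom-up evidence),
    i.e. MAP inference in an assumed Gaussian generative model."""
    h_above = top_down
    inferred = [h_above]
    for l in reversed(range(len(W))):
        h = acts[l].copy()                   # initialize at bottom-up value
        for _ in range(steps):
            pred = relu(h @ W[l])            # top-down generation of layer above
            mask = (pred > 0).astype(float)  # ReLU derivative
            grad = ((pred - h_above) * mask) @ W[l].T + lam * (h - acts[l])
            h -= lr * grad
        inferred.append(h)
        h_above = h                          # this layer becomes the next target
    return inferred[::-1]                    # ordered input-layer first

x = rng.standard_normal(16)
acts = forward(x)
target = acts[-1].copy()
target[0] += 1.0                             # inject a top-down signal
recon = backward_infer(acts, target)
print("change at input induced by top-down signal:",
      np.linalg.norm(recon[0] - x))
```

Injecting a top-down change at the output and running this layer-by-layer inference yields revised activations all the way down to the input, which is the kind of unified query interface (reconstruction, top-down disambiguation) the abstract describes.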


Related research

- Answering Hindsight Queries with Lifted Dynamic Junction Trees (07/02/2018): The lifted dynamic junction tree algorithm (LDJT) efficiently answers fi...
- DoReFa-Net: Training Low Bitwidth Convolutional Neural Networks with Low Bitwidth Gradients (06/20/2016): We propose DoReFa-Net, a method to train convolutional neural networks t...
- Reversible designs for extreme memory cost reduction of CNN training (10/24/2019): Training Convolutional Neural Networks (CNN) is a resource intensive tas...
- Sample-Specific Output Constraints for Neural Networks (03/23/2020): Neural networks reach state-of-the-art performance in a variety of learn...
- Backprop with Approximate Activations for Memory-efficient Network Training (01/23/2019): Larger and deeper neural network architectures deliver improved accuracy...
- Neural Nets via Forward State Transformation and Backward Loss Transformation (03/25/2018): This article studies (multilayer perceptron) neural networks with an emp...
- Expected Hypothetical Completion Probability (10/27/2019): Using high-resolution player tracking data made available by the Nationa...
