
Neural Nets via Forward State Transformation and Backward Loss Transformation

by   Bart Jacobs, et al.
National Institute of Informatics
Radboud Universiteit

This article studies (multilayer perceptron) neural networks with an emphasis on the transformations involved, both forward and backward, in order to develop a semantic/logical perspective in line with standard program semantics. The common two-pass training algorithms for neural networks make this viewpoint particularly fitting. In the forward direction, a neural network acts as a state transformer. In the backward direction, it transforms losses on outputs into losses on inputs, thereby acting as a (real-valued) predicate transformer. In this way, backpropagation is functorial by construction, as also shown in recent related work. We illustrate this perspective by training a simple instance of a neural network.
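The two directions can be sketched in plain Python for a single sigmoid layer. This is an illustrative sketch, not the paper's code: the forward function plays the role of a state transformer (input state to output state), while the backward function pulls a loss gradient on the outputs back to a loss gradient on the inputs, acting as a predicate transformer. All names and the choice of sigmoid activation are assumptions for the example.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(w, b, xs):
    """Forward direction: state transformer mapping input state xs
    to an output state, one sigmoid unit per row of w."""
    return [sigmoid(sum(wi * x for wi, x in zip(row, xs)) + bi)
            for row, bi in zip(w, b)]

def backward(w, b, xs, out_loss_grad):
    """Backward direction: loss transformer mapping a loss gradient
    on the outputs back to a loss gradient on the inputs."""
    ys = forward(w, b, xs)
    # chain through the sigmoid: d sigmoid / dz = y * (1 - y)
    local = [g * y * (1.0 - y) for g, y in zip(out_loss_grad, ys)]
    # transpose-multiply by the weights to reach the inputs
    return [sum(l * row[j] for l, row in zip(local, w))
            for j in range(len(xs))]
```

Composing two such layers forward composes the state transformers; composing their backward passes in the opposite order composes the loss transformers, which is the functoriality the abstract refers to.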
