
Neural Nets via Forward State Transformation and Backward Loss Transformation

03/25/2018
by Bart Jacobs, et al.
National Institute of Informatics
Radboud Universiteit

This article studies (multilayer perceptron) neural networks with an emphasis on the transformations involved, both forward and backward, in order to develop a semantical/logical perspective in line with standard program semantics. The common two-pass training algorithms for neural networks make this viewpoint particularly fitting. In the forward direction, a neural network acts as a state transformer. In the backward direction, it transforms losses on outputs into losses on inputs, thereby acting as a (real-valued) predicate transformer. In this way, backpropagation is functorial by construction, as shown in other recent work. We illustrate this perspective by training a simple instance of a neural network.
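To make the two directions concrete, below is a minimal sketch (not the paper's formalism) of a single dense layer viewed both ways: forward as a state transformer from inputs to outputs, and backward as a loss transformer that pulls a gradient on the output back to a gradient on the input. The sigmoid activation, squared-error loss, and all names are illustrative assumptions, not taken from the article.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(W, b, x):
    """State transformation: input state x -> output state y."""
    z = W @ x + b          # linear map
    return sigmoid(z), z   # keep z for the backward pass

def backward(W, z, dL_dy):
    """Loss transformation: loss on the output -> loss on the input.

    dL_dy is the gradient of the loss with respect to the output;
    the return value is the gradient with respect to the input, i.e.
    the real-valued 'predicate' on outputs pulled back along the layer.
    """
    s = sigmoid(z)
    dL_dz = dL_dy * s * (1.0 - s)   # chain rule through the activation
    return W.T @ dL_dz              # chain rule through the linear map

# Tiny usage example with arbitrary data (illustrative only).
rng = np.random.default_rng(0)
W = rng.normal(size=(2, 3))
b = rng.normal(size=2)
x = rng.normal(size=3)
target = np.array([0.0, 1.0])

y, z = forward(W, b, x)
dL_dy = y - target        # gradient of squared-error loss on the output
dL_dx = backward(W, z, dL_dy)
print(dL_dx)              # the induced loss on the input state

Note how the backward function never mentions the loss function itself: it only consumes a gradient on outputs and produces one on inputs, which is what makes the "predicate transformer" reading, and the compositionality (functoriality) of backpropagation, visible even in this toy setting.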



Related research

01/05/2016  DrMAD: Distilling Reverse-Mode Automatic Differentiation for Optimizing Hyperparameters of Deep Neural Networks
The performance of deep neural networks is well-known to be sensitive to...

07/08/2020  BS4NN: Binarized Spiking Neural Networks with Temporal Coding and Learning
We recently proposed the S4NN algorithm, essentially an adaptation of ba...

07/05/2017  Combining Forward and Backward Abstract Interpretation of Horn Clauses
Alternation of forward and backward analyses is a standard technique in ...

07/13/2018  Perceptrons from Memristors
Memristors, resistors with memory whose outputs depend on the history of...

02/10/2023  Forward Learning with Top-Down Feedback: Empirical and Analytical Characterization
"Forward-only" algorithms, which train neural networks while avoiding a ...

06/15/2017  Hardware-efficient on-line learning through pipelined truncated-error backpropagation in binary-state networks
Artificial neural networks (ANNs) trained using backpropagation are powe...

11/08/2016  A backward pass through a CNN using a generative model of its activations
Neural networks have shown to be a practical way of building a very comp...