A Closer Look at Double Backpropagation

06/16/2019
by Christian Etmann, et al.

In recent years, an increasing number of neural network models have included derivatives with respect to inputs in their loss functions, resulting in so-called double backpropagation for first-order optimization. However, so far no general description of the involved derivatives exists. Here, we cover a wide array of special cases in a very general Hilbert space framework, which allows us to provide optimized backpropagation rules for many real-world scenarios. This includes the reduction of calculations for Frobenius-norm penalties on Jacobians by roughly a third for locally linear activation functions. Furthermore, we provide a description of the discontinuous loss surface of ReLU networks, both in the inputs and in the parameters, and demonstrate why these discontinuities do not pose a significant problem in practice.
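To illustrate the idea of double backpropagation described above, the following is a minimal sketch of a loss that penalizes the gradient of the task loss with respect to the input, written in PyTorch. The names `model`, `x`, `y`, and `lambda_penalty` are illustrative assumptions, not taken from the paper, and the penalty shown (squared norm of the input gradient) is only one simple instance of the Jacobian penalties discussed in the abstract.

```python
import torch

def double_backprop_loss(model, x, y, lambda_penalty=0.01):
    # Track gradients with respect to the input, not just the parameters.
    x = x.clone().requires_grad_(True)
    prediction = model(x)
    task_loss = torch.nn.functional.cross_entropy(prediction, y)

    # First backward pass: gradient of the task loss w.r.t. the input.
    # create_graph=True keeps the computation graph so this gradient can
    # itself be differentiated in the outer (second) backward pass.
    input_grad, = torch.autograd.grad(task_loss, x, create_graph=True)

    # Penalize the squared norm of the input gradient, averaged over the batch.
    penalty = input_grad.pow(2).flatten(start_dim=1).sum(dim=1).mean()
    return task_loss + lambda_penalty * penalty

# Calling .backward() on the returned loss triggers the second backward
# pass through the input gradient, i.e. double backpropagation.
```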


