Quaternion Backpropagation

12/26/2022
by Johannes Pöppelbaum et al.

Quaternion-valued neural networks have seen rising popularity and research interest in recent years. In much of this work, the derivatives with respect to quaternions needed for optimization are computed as the sum of the partial derivatives with respect to the real and imaginary parts. However, we show that the product and chain rules do not hold under this approach. We solve this by employing the GHR calculus and derive quaternion backpropagation based on it. Furthermore, we experimentally verify the functionality of the derived quaternion backpropagation.
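The failure of the naive component-wise product and chain rules traces back to the non-commutativity of quaternion multiplication (ij = k but ji = -k), which the GHR calculus accounts for. A minimal sketch demonstrating this non-commutativity with NumPy; the `qmul` helper is illustrative and not taken from the paper:

```python
import numpy as np

def qmul(p, q):
    """Hamilton product of two quaternions given as (w, x, y, z) arrays."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,   # real part
        w1*x2 + x1*w2 + y1*z2 - z1*y2,   # i part
        w1*y2 - x1*z2 + y1*w2 + z1*x2,   # j part
        w1*z2 + x1*y2 - y1*x2 + z1*w2,   # k part
    ])

# Basis quaternions i and j
i = np.array([0., 1., 0., 0.])
j = np.array([0., 0., 1., 0.])

print(qmul(i, j))   # i*j = k  -> [0. 0. 0. 1.]
print(qmul(j, i))   # j*i = -k -> [ 0.  0.  0. -1.]
```

Because the two orderings give different results, differentiation rules that silently assume commutative factors (as the real-valued product and chain rules do) cannot be applied term by term to quaternion expressions.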


