Brief technical note on linearizing recurrent neural networks (RNNs) before vs after the pointwise nonlinearity

09/07/2023
by Marino Pagan, et al.

Linearization of the dynamics of recurrent neural networks (RNNs) is often used to study their properties. The same RNN dynamics can be written in terms of the “activations” (the net inputs to each unit, before its pointwise nonlinearity) or in terms of the “activities” (the output of each unit, after its pointwise nonlinearity); the two corresponding linearizations differ from each other. This brief and informal technical note describes the relationship between the two linearizations and between the left and right eigenvectors of their dynamics matrices, and shows that some context-dependent effects are readily apparent under linearization of the activity dynamics but not under linearization of the activation dynamics.
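As a minimal numeric sketch of the relationship the abstract describes (this example is not from the note itself): assume a standard continuous-time RNN with activation dynamics ẋ = −x + Wφ(x) + u, activities r = φ(x), and a tanh nonlinearity. At a fixed point x*, with D = diag(φ′(x*)), the linearized activation dynamics have Jacobian J_x = −I + WD while the linearized activity dynamics have Jacobian J_r = −I + DW. When D is invertible these are similar matrices, J_r = D J_x D⁻¹, so they share eigenvalues while their eigenvectors differ.

```python
import numpy as np

# Illustrative setup (names W, x_star, etc. are assumptions, not the note's notation).
rng = np.random.default_rng(0)
n = 5
W = rng.standard_normal((n, n)) / np.sqrt(n)   # recurrent weight matrix
x_star = rng.standard_normal(n)                # arbitrary linearization point
phi_prime = 1.0 / np.cosh(x_star) ** 2         # derivative of tanh at x*
D = np.diag(phi_prime)                         # strictly positive, hence invertible

J_x = -np.eye(n) + W @ D   # Jacobian of the activation dynamics
J_r = -np.eye(n) + D @ W   # Jacobian of the activity dynamics

# The two linearizations share a spectrum ...
eig_x = np.sort_complex(np.linalg.eigvals(J_x))
eig_r = np.sort_complex(np.linalg.eigvals(J_r))
assert np.allclose(eig_x, eig_r)

# ... because they are related by the similarity transform J_r = D J_x D^{-1},
# which also maps the right/left eigenvectors of one Jacobian into the other's.
assert np.allclose(J_r, D @ J_x @ np.linalg.inv(D))
```

Since tanh has φ′ > 0 everywhere, D is always invertible here; for nonlinearities with saturated or dead units (e.g. ReLU), D can be singular and the similarity argument no longer applies directly, which is one reason the two linearizations can expose different structure.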


