Feed-forward approximations to dynamic recurrent network architectures

04/21/2017
by Dylan Richard Muir, et al.

Recurrent neural network architectures can have useful computational properties, with complex temporal dynamics and input-sensitive attractor states. However, evaluating recurrent dynamic architectures requires solving systems of differential equations, and the number of evaluations needed to determine their response to a given input can vary with the input, or can be indeterminate altogether in the case of oscillations or instability. In feed-forward networks, by contrast, only a single pass through the network is needed to determine the response to a given input. Modern machine-learning systems are designed to operate efficiently on feed-forward architectures. We hypothesised that two-layer feed-forward architectures with simple, deterministic dynamics could approximate the responses of single-layer recurrent network architectures. By identifying the fixed-point responses of a given recurrent network, we trained two-layer networks to directly approximate the fixed-point response to a given input. These feed-forward networks then embodied useful computations, including competitive interactions, information transformations and noise rejection. Our approach was able to find useful approximations to recurrent networks, which can then be evaluated in linear and deterministic time complexity.
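The core idea described above — mapping inputs directly to the fixed points of a recurrent network, then training a two-layer feed-forward network on those input/fixed-point pairs — can be sketched in a few lines. The following is a minimal illustration, not the paper's actual method: it assumes a small stable *linear* recurrent network tau dx/dt = -x + Wx + u (where the fixed point x* = (I - W)^(-1) u is available in closed form) and a hypothetical two-layer tanh approximator trained by plain gradient descent; all sizes, learning rates and weight scales are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stable linear recurrent network: tau dx/dt = -x + W @ x + u.
# With spectral radius of W below 1, the fixed-point response to a constant
# input u is x* = (I - W)^{-1} u — no ODE integration required here.
N = 8
W = 0.4 * rng.standard_normal((N, N)) / np.sqrt(N)  # weak random recurrence
A = np.linalg.inv(np.eye(N) - W)                    # maps input -> fixed point

# Training set of (input, fixed-point response) pairs.
U = rng.standard_normal((2000, N))
X_star = U @ A.T

# Two-layer feed-forward approximator (tanh hidden layer, linear readout),
# fit by full-batch gradient descent on mean-squared error.
H = 32
W1 = 0.3 * rng.standard_normal((N, H))
W2 = 0.3 * rng.standard_normal((H, N))
lr = 0.05
losses = []
for step in range(2000):
    h = np.tanh(U @ W1)          # hidden-layer activity
    pred = h @ W2                # feed-forward estimate of x*
    err = pred - X_star
    losses.append(np.mean(err ** 2))
    gW2 = h.T @ err / len(U)     # gradient w.r.t. readout weights
    gW1 = U.T @ ((err @ W2.T) * (1 - h ** 2)) / len(U)  # backprop through tanh
    W1 -= lr * gW1
    W2 -= lr * gW2

# The trained network now approximates the recurrent fixed point in a
# single deterministic pass, as the abstract describes.
u_test = rng.standard_normal(N)
x_rec = A @ u_test                       # true fixed point of the RNN
x_ff = np.tanh(u_test @ W1) @ W2         # one-shot feed-forward estimate
print(f"MSE: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

The key property being demonstrated is in the last two lines: evaluating `x_ff` costs two matrix products regardless of the input, whereas simulating the recurrent dynamics to convergence would take an input-dependent number of steps.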


