
Cortical microcircuits as gated-recurrent neural networks

11/07/2017
by Rui Ponte Costa, et al. (Google; University of Oxford)

Cortical circuits exhibit intricate recurrent architectures that are remarkably similar across different brain areas. This stereotyped structure suggests the existence of common computational principles, which have nonetheless remained largely elusive. Inspired by gated-memory networks, namely long short-term memory networks (LSTMs), we introduce a recurrent neural network in which information is gated through inhibitory cells that are subtractive (subLSTM). We propose a natural mapping of subLSTMs onto known canonical excitatory-inhibitory cortical microcircuits. Our empirical evaluation across sequential image classification and language modelling tasks shows that subLSTM units can achieve performance similar to that of LSTM units. These results suggest that cortical circuits can be optimised to solve complex contextual problems, and they propose a novel view of the circuits' computational function. Overall, our work provides a step towards unifying the recurrent networks used in machine learning with their biological counterparts.
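To make the subtractive-gating idea concrete, here is a minimal NumPy sketch of a subLSTM-style cell. It assumes the gating scheme described in the abstract: all streams are sigmoid-bounded, and the input and output gates are subtracted from the signal rather than multiplied into it (the forget gate is kept multiplicative here). The class name, weight initialisation, and exact equations are illustrative assumptions, not the paper's reference implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SubLSTMCell:
    """Illustrative subLSTM-style cell: input/output gates act
    subtractively (hypothetical sketch, not the paper's code)."""

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        k = input_size + hidden_size
        # One weight matrix and bias per stream:
        # z (candidate), i (input gate), f (forget gate), o (output gate).
        self.W = {g: 0.1 * rng.standard_normal((hidden_size, k)) for g in "zifo"}
        self.b = {g: np.zeros(hidden_size) for g in "zifo"}
        self.hidden_size = hidden_size

    def step(self, x, h_prev, c_prev):
        v = np.concatenate([x, h_prev])
        # Every stream is sigmoid-bounded, so each subtraction below
        # stays within (-1, 1).
        z, i, f, o = (sigmoid(self.W[g] @ v + self.b[g]) for g in "zifo")
        c = f * c_prev + z - i      # subtractive input gating
        h = sigmoid(c) - o          # subtractive output gating
        return h, c

# Example: run one step on a random input vector.
cell = SubLSTMCell(input_size=4, hidden_size=8)
h, c = cell.step(np.ones(4),
                 h_prev=np.zeros(8),
                 c_prev=np.zeros(8))
```

Because subtraction by a bounded inhibitory term replaces multiplicative shunting, the gates here act like the subtractive inhibitory interneurons the abstract maps onto cortical microcircuits; the rest of the recurrence is unchanged from a standard LSTM.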

