Neural Mesh: Introducing a Notion of Space and Conservation of Energy to Neural Networks

07/29/2018
by Jacob Beck, et al.

Neural networks are based on a simplified model of the brain. In this project, we wanted to relax the simplifying assumptions of a traditional neural network by building a model that more closely emulates the low-level interactions of neurons. As in an RNN, our model has a state that persists between time steps, so that the energies of neurons persist. Unlike an RNN, however, our state is a two-dimensional matrix rather than a one-dimensional vector, which introduces a notion of distance between neurons within the state. In our model, neurons can fire only to adjacent neurons, as in the brain. Also as in the brain, a neuron is allowed to fire in a time step only if it contains enough energy, or excitement. We further enforce a notion of conservation of energy, so that a neuron cannot excite its neighbors by more than the excitement it contained at that time step. Together, these two features allow signals, in the form of activations, to flow through the network over time, making our neural mesh more closely model signals traveling through the brain. Although our main goal is to design an architecture that more closely emulates the brain, in the hope of having a correct internal representation of information by the time we know how to properly train a general intelligence, we did benchmark our neural mesh on a specific task. We found that by increasing the runtime of the mesh, we were able to increase its accuracy without increasing the number of parameters.
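To make the described dynamics concrete, below is a minimal NumPy sketch of one possible update step of such a mesh. It assumes a 4-neighbor adjacency, a fixed firing threshold, per-neuron outgoing transfer fractions as the parameters, and open boundaries; none of these specifics come from the abstract, so the names (mesh_step, weights, threshold) and the exact update rule are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def mesh_step(energy, weights, threshold=1.0):
    """One hypothetical neural-mesh update step (illustrative only).

    energy  : (H, W) array of per-neuron energy that persists across steps.
    weights : (H, W, 4) non-negative outgoing fractions toward the four
              adjacent neighbors, ordered (up, down, left, right).
    """
    # A neuron may fire this step only if it holds enough energy.
    firing = energy >= threshold

    # Normalize outgoing fractions so the total sent never exceeds the
    # stored energy (conservation: a neuron cannot excite its neighbors
    # by more than its own excitement).
    frac = weights / np.maximum(weights.sum(axis=-1, keepdims=True), 1.0)
    sent = np.where(firing[..., None], frac * energy[..., None], 0.0)

    # Subtract everything sent, then route each share to the adjacent
    # neuron; energy sent off the edge of the grid is simply lost
    # (one possible boundary choice).
    new_energy = energy - sent.sum(axis=-1)
    new_energy[:-1, :] += sent[1:, :, 0]   # shares sent "up"
    new_energy[1:, :]  += sent[:-1, :, 1]  # shares sent "down"
    new_energy[:, :-1] += sent[:, 1:, 2]   # shares sent "left"
    new_energy[:, 1:]  += sent[:, :-1, 3]  # shares sent "right"
    return new_energy

# Example: run the mesh for a few extra steps without adding parameters.
rng = np.random.default_rng(0)
energy = rng.uniform(0.0, 2.0, size=(8, 8))
weights = rng.uniform(0.0, 1.0, size=(8, 8, 4))
for _ in range(5):
    energy = mesh_step(energy, weights)
```

In this sketch total energy can only stay constant or decrease (edge neurons leak energy off the grid), and a neuron below threshold simply holds its energy for later steps, which is how activations persist and travel across the mesh over multiple time steps.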

research 01/05/2019
Rethinking the Artificial Neural Networks: A Mesh of Subnets with a Central Mechanism for Storing and Predicting the Data
The Artificial Neural Networks (ANNs) have been originally designed to f...

research 09/22/2021
Naming Schema for a Human Brain-Scale Neural Network
Deep neural networks have become increasingly large and sparse, allowing...

research 03/08/2023
On the Benefits of Biophysical Synapses
The approximation capability of ANNs and their RNN instantiations, is st...

research 06/03/2021
Rich dynamics caused by known biological brain network features resulting in stateful networks
The mammalian brain could contain dense and sparse network connectivity ...

research 11/04/2020
New Ideas for Brain Modelling 7
This paper further integrates the cognitive model, making it mathematica...

research 04/09/2019
A Feature-Value Network as a Brain Model
This paper suggests a statistical framework for describing the relations...

research 05/13/2020
MPI+OpenMP Tasking Scalability for Multi-Morphology Simulations of the Human Brain
The simulation of the behavior of the human brain is one of the most amb...
