The ELM Neuron: an Efficient and Expressive Cortical Neuron Model Can Solve Long-Horizon Tasks

by Aaron Spieler, et al.

Traditional large-scale neuroscience models and machine learning utilize simplified models of individual neurons, relying on collective activity and properly adjusted connections to perform complex computations. However, each biological cortical neuron is inherently a sophisticated computational device, as corroborated in a recent study where it took a deep artificial neural network with millions of parameters to replicate the input-output relationship of a detailed biophysical model of a cortical pyramidal neuron. We question the necessity of this many parameters and introduce the Expressive Leaky Memory (ELM) neuron, a biologically inspired, computationally expressive, yet efficient model of a cortical neuron. Remarkably, our ELM neuron requires only 8K trainable parameters to accurately match the aforementioned input-output relationship. We find that an accurate model necessitates multiple memory-like hidden states and intricate nonlinear synaptic integration. To assess the computational ramifications of this design, we evaluate the ELM neuron on various tasks with demanding temporal structures, including a sequential version of the CIFAR-10 classification task, the challenging Pathfinder-X task, and a new dataset based on the Spiking Heidelberg Digits dataset. Our ELM neuron outperforms most transformer-based models on the Pathfinder-X task (77% accuracy) and demonstrates superior performance compared to classic LSTM models on the variant of the Spiking Heidelberg Digits dataset. These findings indicate a potential for biologically motivated, computationally efficient neuronal models to enhance performance in challenging machine learning tasks.
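To make the abstract's design claims concrete, here is a minimal sketch of an ELM-style single neuron: a vector of leaky memory states, each with its own decay time constant, driven by a learned nonlinear function of the synaptic input and the current memory. All dimensions, the log-spaced time constants, and the small-MLP integration are illustrative assumptions for exposition, not the paper's exact equations or parameter counts.

```python
import numpy as np

class ELMNeuronSketch:
    """Illustrative ELM-style neuron: multiple leaky memory states plus
    nonlinear synaptic integration. The specific update rule below is an
    assumption for demonstration, not the published model."""

    def __init__(self, n_inputs, n_memory, tau_min=1.0, tau_max=100.0, seed=0):
        rng = np.random.default_rng(seed)
        # One decay time constant per memory state (log-spaced; assumed choice
        # to cover both fast and slow timescales).
        self.tau = np.geomspace(tau_min, tau_max, n_memory)
        # Small one-hidden-layer network for nonlinear synaptic integration.
        self.W1 = rng.normal(0.0, 0.1, (n_memory, n_inputs + n_memory))
        self.W2 = rng.normal(0.0, 0.1, (1, n_memory))
        # Memory-like hidden states, initialized at rest.
        self.m = np.zeros(n_memory)

    def step(self, x, dt=1.0):
        # Nonlinear integration of the synaptic input and current memory.
        drive = np.tanh(self.W1 @ np.concatenate([x, self.m]))
        # Leaky update: each state decays toward its drive with its own tau.
        decay = np.exp(-dt / self.tau)
        self.m = decay * self.m + (1.0 - decay) * drive
        # Scalar readout (loosely analogous to a somatic output).
        return float(self.W2 @ self.m)
```

In this toy form, the long-horizon capability comes from the slow-tau states, which retain information across many time steps while the fast-tau states track recent input.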


SIT: A Bionic and Non-Linear Neuron for Spiking Neural Network

Spiking Neural Networks (SNNs) have piqued researchers' interest because...

To update or not to update? Neurons at equilibrium in deep models

Recent advances in deep learning optimization showed that, with some a-p...

Network of Evolvable Neural Units: Evolving to Learn at a Synaptic Level

Although Deep Neural Networks have seen great success in recent years th...

Spiking Machine Intelligence: What we can learn from biology and how spiking Neural Networks can help to improve Machine Learning

Up to now, modern Machine Learning is based on fitting high dimensional ...

Dynamic neuronal networks efficiently achieve classification in robotic interactions with real-world objects

Biological cortical networks are potentially fully recurrent networks wi...

Tensor decomposition of higher-order correlations by nonlinear Hebbian plasticity

Biological synaptic plasticity exhibits nonlinearities that are not acco...

Condition Integration Memory Network: An Interpretation of the Meaning of the Neuronal Design

This document introduces a hypothesized framework on the functional natu...
