Sparsity through evolutionary pruning prevents neuronal networks from overfitting

11/07/2019
by Richard C. Gerum, et al.

Modern machine learning techniques take advantage of the exponentially growing computing power of new generations of processing units. As a result, the number of parameters trained to solve complex tasks has increased dramatically over the last decades. Yet, in contrast to our brain, these networks still fail to develop general intelligence in the sense of solving several complex tasks with a single network architecture. One reason may be that the brain is not a randomly initialized neural network that is trained simply by investing large amounts of computing power, but instead possesses a fixed hierarchical structure from birth. To make progress in decoding the structural basis of biological neural networks, we chose a bottom-up approach and evolutionarily trained small neural networks to perform a maze task. This simple maze task requires dynamic decision making with delayed rewards. We show that random severance of connections during evolutionary optimization leads to better generalization performance than that of fully connected networks. We conclude that sparsity is a central property of neural networks and should be considered in modern machine learning approaches.
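To illustrate the kind of procedure the abstract describes, the following is a minimal sketch of evolutionary optimization with random severance of connections, written in Python with NumPy. The network size, mutation rate, pruning probability, toy fitness function, and all helper names are assumptions made for illustration; they do not reproduce the paper's actual maze task or architecture.

# Minimal sketch of evolutionary pruning: a population of small masked
# feed-forward networks evolves by weight mutation and random connection
# severance. All hyperparameters and the toy fitness are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

N_IN, N_HIDDEN, N_OUT = 4, 8, 2    # assumed toy network size
POP_SIZE, GENERATIONS = 20, 50     # assumed evolutionary parameters
MUTATION_STD, PRUNE_PROB = 0.1, 0.02


def new_individual():
    # A genome: weight matrices plus binary masks marking live connections.
    return {
        "w1": rng.normal(0, 1, (N_IN, N_HIDDEN)),
        "m1": np.ones((N_IN, N_HIDDEN)),
        "w2": rng.normal(0, 1, (N_HIDDEN, N_OUT)),
        "m2": np.ones((N_HIDDEN, N_OUT)),
    }


def forward(ind, x):
    # Run the masked network: severed connections contribute nothing.
    h = np.tanh(x @ (ind["w1"] * ind["m1"]))
    return np.tanh(h @ (ind["w2"] * ind["m2"]))


def mutate(ind):
    # Jitter weights and randomly sever a small fraction of connections.
    child = {k: v.copy() for k, v in ind.items()}
    for w, m in (("w1", "m1"), ("w2", "m2")):
        child[w] += rng.normal(0, MUTATION_STD, child[w].shape)
        child[m] *= (rng.random(child[m].shape) > PRUNE_PROB)  # pruning step
    return child


def fitness(ind, inputs, targets):
    # Toy stand-in for the maze task: negative mean squared error.
    return -np.mean((forward(ind, inputs) - targets) ** 2)


# Toy data standing in for task observations and desired outputs.
inputs = rng.normal(0, 1, (32, N_IN))
targets = np.tanh(inputs[:, :N_OUT])

population = [new_individual() for _ in range(POP_SIZE)]
for gen in range(GENERATIONS):
    scored = sorted(population,
                    key=lambda ind: fitness(ind, inputs, targets),
                    reverse=True)
    parents = scored[: POP_SIZE // 4]   # keep the fittest quarter
    population = parents + [
        mutate(parents[rng.integers(len(parents))])
        for _ in range(POP_SIZE - len(parents))
    ]

best = population[0]  # fittest individual from the final selection step
print("surviving connections:",
      int(best["m1"].sum() + best["m2"].sum()),
      "of", N_IN * N_HIDDEN + N_HIDDEN * N_OUT)

Because the masks are only ever multiplied by new random severances, connectivity can only shrink over generations, so the surviving individuals become progressively sparser, which is the effect the paper links to improved generalization.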

