Evolving and Understanding Sparse Deep Neural Networks using Cosine Similarity

03/17/2019
by Joost Pieterse, et al.

Training sparse neural networks with adaptive connectivity is an active research topic. Such networks require less storage and have lower computational complexity compared to their dense counterparts. The Sparse Evolutionary Training (SET) procedure uses weight magnitude to efficiently evolve the topology of a sparse network to fit the dataset, while enabling it to have quadratically fewer parameters than its dense counterpart. Building on this, we propose a novel approach that evolves a sparse network topology based on the behavior of the neurons in the network. More precisely, the cosine similarities between the activations of any two neurons are used to determine which connections are added to or removed from the network. By integrating our approach within the SET procedure, we propose 5 new algorithms to train sparse neural networks. We argue that our approach has low additional computational complexity and we draw a parallel to Hebbian learning. Experiments are performed on 8 datasets from various domains to demonstrate the general applicability of our approach. Even without optimizing hyperparameters for specific datasets, the experiments show that our proposed training algorithms usually outperform SET and state-of-the-art dense neural network techniques. Last but not least, we show that the evolved connectivity patterns of the input neurons reflect their impact on the classification task.
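The rewiring rule sketched in the abstract can be illustrated with a short NumPy example. The code below is a minimal illustration, not the authors' exact implementation: the mask layout, the function names, and the rewire_fraction value are assumptions. It computes cosine similarities between the activations of the input and output neurons of one sparse layer over a batch of samples, removes the connections between the least similar neuron pairs, and grows the same number of connections between the most similar, currently unconnected pairs.

```python
import numpy as np

def cosine_similarity_matrix(pre_acts, post_acts):
    """Cosine similarity between each input neuron and each output neuron,
    computed over a batch of samples.

    pre_acts:  (num_samples, num_pre)  activations of the layer's input neurons
    post_acts: (num_samples, num_post) activations of the layer's output neurons
    """
    pre = pre_acts / (np.linalg.norm(pre_acts, axis=0, keepdims=True) + 1e-12)
    post = post_acts / (np.linalg.norm(post_acts, axis=0, keepdims=True) + 1e-12)
    return pre.T @ post  # shape: (num_pre, num_post)

def evolve_connections(mask, pre_acts, post_acts, rewire_fraction=0.3):
    """One topology-evolution step for a single sparse layer (illustrative).

    Removes the existing connections whose endpoint neurons have the lowest
    cosine similarity, then adds the same number of new connections between
    the most similar, currently unconnected neuron pairs.

    mask: boolean (num_pre, num_post) connectivity mask of the sparse layer.
    rewire_fraction: share of connections rewired per step (hypothetical value).
    """
    sim = cosine_similarity_matrix(pre_acts, post_acts)
    n_rewire = int(rewire_fraction * mask.sum())

    # Existing connections, ranked by ascending similarity: drop the weakest.
    existing = np.argwhere(mask)
    weakest = existing[np.argsort(sim[mask])[:n_rewire]]

    # Unconnected pairs, ranked by descending similarity: add the strongest.
    missing = np.argwhere(~mask)
    strongest = missing[np.argsort(sim[~mask])[::-1][:n_rewire]]

    new_mask = mask.copy()
    new_mask[tuple(weakest.T)] = False
    new_mask[tuple(strongest.T)] = True
    return new_mask
```

In a SET-style training loop, a step like this would be interleaved with ordinary gradient-based weight updates, with the activations collected from a batch of training samples just before each rewiring phase.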

Related research

On improving deep learning generalization with adaptive sparse connectivity (06/27/2019)
Large neural networks are very successful in various tasks. However, wit...

Topological Insights in Sparse Neural Networks (06/24/2020)
Sparse neural networks are effective approaches to reduce the resource r...

Network size and weights size for memorization with two-layers neural networks (06/04/2020)
In 1988, Eric B. Baum showed that two-layers neural networks with thresh...

Artificial Neural Networks generated by Low Discrepancy Sequences (03/05/2021)
Artificial neural networks can be represented by paths. Generated as ran...

Rigging the Lottery: Making All Tickets Winners (11/25/2019)
Sparse neural networks have been shown to be more parameter and compute ...

Sparse Networks from Scratch: Faster Training without Losing Performance (07/10/2019)
We demonstrate the possibility of what we call sparse learning: accelera...

Evolving neural networks to follow trajectories of arbitrary complexity (05/21/2019)
Many experiments have been performed that use evolutionary algorithms fo...
