Growing Artificial Neural Networks

06/11/2020
by John Mixter, et al.

Pruning is a legitimate method for reducing the size of a neural network to fit in low SWaP (size, weight, and power) hardware, but the networks must be trained and pruned offline. We propose an algorithm, Artificial Neurogenesis (ANG), that grows rather than prunes the network and enables neural networks to be trained and executed in low SWaP embedded hardware. ANG accomplishes this by using the training data to determine critical connections between layers before the actual training takes place. Our experiments use a modified LeNet-5 as a baseline neural network that achieves a test accuracy of 98.74%. The grown network achieves a test accuracy of 98.80%.
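The abstract does not spell out how ANG scores connections, so the following is only a rough sketch of the general idea it describes: use statistics of the training data to pick a sparse set of connections before any gradient training happens. The function name grow_connections, the correlation-based scoring rule, and the keep_ratio parameter are illustrative assumptions rather than the paper's method, and the sketch covers only the input-to-first-hidden-layer connections for brevity.

```python
# Hypothetical sketch of "determine critical connections before training".
# The scoring rule (absolute feature/target correlation) and all names here
# are assumptions for illustration; the paper's actual ANG algorithm may
# select connections differently.
import numpy as np

def grow_connections(X, y, n_hidden, keep_ratio=0.1, seed=0):
    """Return a 0/1 mask over an (n_inputs x n_hidden) weight matrix.

    Input features are ranked by their strongest absolute correlation with
    any one-hot class indicator; each hidden unit connects only to the top
    `keep_ratio` fraction of inputs, so the network is grown sparse from
    the start instead of trained dense and pruned afterwards.
    """
    rng = np.random.default_rng(seed)
    n_inputs = X.shape[1]
    Y = np.eye(y.max() + 1)[y]                      # one-hot targets
    Xc = X - X.mean(axis=0)                         # center features
    Yc = Y - Y.mean(axis=0)                         # center targets
    corr = Xc.T @ Yc
    corr /= (np.linalg.norm(Xc, axis=0)[:, None] *
             np.linalg.norm(Yc, axis=0)[None, :] + 1e-12)
    score = np.abs(corr).max(axis=1)                # one score per input
    # Each hidden unit keeps the k highest-scoring inputs; a little noise
    # keeps the units from all growing an identical fan-in.
    mask = np.zeros((n_inputs, n_hidden))
    k = max(1, int(keep_ratio * n_inputs))
    for j in range(n_hidden):
        noisy = score + 0.01 * rng.standard_normal(n_inputs)
        mask[np.argsort(noisy)[-k:], j] = 1.0
    return mask

# Toy usage: 100 samples, 64 features, 10 classes, 32 hidden units.
X = np.random.rand(100, 64)
y = np.random.randint(0, 10, size=100)
mask = grow_connections(X, y, n_hidden=32, keep_ratio=0.1)
print("grown connections:", int(mask.sum()), "of", mask.size)
```

In a full pipeline, the returned mask would be multiplied into the first layer's weights at every update so that only the grown connections are ever trained, which is what makes training feasible on low SWaP hardware.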


Related research

03/05/2021 · Artificial Neural Networks generated by Low Discrepancy Sequences
Artificial neural networks can be represented by paths. Generated as ran...

06/03/2020 · Assessing Intelligence in Artificial Neural Networks
The purpose of this work was to develop metrics to assess network arc...

03/17/2017 · Deciding How to Decide: Dynamic Routing in Artificial Neural Networks
We propose and systematically evaluate three strategies for training dyn...

02/26/2020 · Predicting Neural Network Accuracy from Weights
We study the prediction of the accuracy of a neural network given only i...

06/15/2020 · Interaction Networks: Using a Reinforcement Learner to train other Machine Learning algorithms
The wiring of neurons in the brain is more flexible than the wiring of c...

10/21/2019 · Coercing Machine Learning to Output Physically Accurate Results
Many machine/deep learning artificial neural networks are trained to sim...

05/11/2020 · CupNet – Pruning a network for geometric data
Using data from a simulated cup drawing process, we demonstrate how the ...
