Constructing Organism Networks from Collaborative Self-Replicators

12/20/2022
by Steffen Illium, et al.

We introduce organism networks, which function like a single neural network but are composed of several neural particle networks: each particle network fulfils the role of a single weight application within the organism network, while also being trained to self-replicate its own weights. As organism networks feature vastly more parameters than simpler architectures, we perform our initial experiments on an arithmetic task as well as on simplified MNIST-dataset classification performed by the collective. We observe that individual particle networks tend to specialise in one of the two tasks, and that particles fully specialised in the secondary (self-replication) task can be dropped from the network without hindering the computational accuracy of the primary task. This leads to the discovery of a novel pruning strategy for sparse neural networks.
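The abstract's core idea pairs two roles for each particle network: acting as a single weight application and reproducing its own parameters. A minimal sketch of the second role is the classic fixed-point notion of self-replication, where feeding the net its own parameters reproduces them. The class name, the two-parameter tanh net, and the self-application loop below are illustrative assumptions, not the paper's exact formulation:

```python
import math

# Hedged toy sketch: a "particle network" is a tiny net that plays the role
# of one weight application inside the larger organism network, while also
# being pushed toward self-replication. One common formalisation of
# self-replication is a fixed point of self-application: the net, applied
# to its own parameters, outputs those same parameters.

class ParticleNetwork:
    def __init__(self, u, v):
        self.u, self.v = u, v  # two scalar parameters of the toy net

    def forward(self, x):
        # Near x = 0 this acts like a single weight application, y ~ (u*v)*x,
        # which is the particle's role inside the organism network.
        return self.v * math.tanh(self.u * x)

    def self_apply(self):
        # Replace each parameter by the net's output on that parameter.
        self.u, self.v = self.forward(self.u), self.forward(self.v)

net = ParticleNetwork(u=0.5, v=0.5)
for _ in range(20):
    net.self_apply()

# A perfect self-replicator satisfies forward(p) == p for each parameter p.
# This untrained toy collapses to the trivial fixed point at 0; actual
# training (as in the paper) is needed to reach non-trivial replicators.
residual = abs(net.forward(net.u) - net.u)
print(residual)
```

Repeated self-application converging to the zero net illustrates why a dedicated self-replication training signal is needed at all: without it, the trivial fixed point dominates.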


Related research

06/21/2022  Renormalized Sparse Neural Network Pruning
Large neural networks are heavily over-parameterized. This is done becau...

06/11/2019  Weight Agnostic Neural Networks
Not all neural network architectures are created equal, some perform muc...

09/21/2021  Stabilizing Elastic Weight Consolidation method in practical ML tasks and using weight importances for neural network pruning
This paper is devoted to the features of the practical application of El...

04/30/2022  Engineering flexible machine learning systems by traversing functionally invariant paths in weight space
Deep neural networks achieve human-like performance on a variety of perc...

05/18/2017  Building effective deep neural network architectures one feature at a time
Successful training of convolutional neural networks is often associated...

04/03/2023  Self-building Neural Networks
During the first part of life, the brain develops while it learns throug...

01/31/2022  Signing the Supermask: Keep, Hide, Invert
The exponential growth in numbers of parameters of neural networks over ...
