When, where, and how to add new neurons to ANNs

02/17/2022
by Kaitlin Maile et al.

Neurogenesis in ANNs is an understudied and difficult problem, even compared to other forms of structural learning such as pruning. By decomposing it into triggers and initializations, we introduce a framework for studying the various facets of neurogenesis: when, where, and how to add neurons during the learning process. We present the Neural Orthogonality (NORTH*) suite of neurogenesis strategies, combining layer-wise triggers and initializations based on the orthogonality of activations or weights to dynamically grow performant networks that converge to an efficient size. We evaluate our contributions against other recent neurogenesis works using MLPs.
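To make the trigger/initialization decomposition concrete, the following is a minimal NumPy sketch of one way a layer-wise, orthogonality-based neurogenesis step could look: the trigger fires when a layer's activations nearly span its current width (measured by effective rank), and the new neuron's incoming weights are initialized orthogonal to the existing ones while its outgoing weights start at zero. This is an illustrative assumption of the general idea, not the paper's NORTH* implementation; the function names and thresholds are hypothetical.

import numpy as np

def effective_rank(acts, tol=1e-3):
    """Count significant singular values of an (n_samples, width) activation matrix."""
    s = np.linalg.svd(acts, compute_uv=False)
    return int(np.sum(s > tol * s[0]))

def should_grow(acts, ratio=0.95):
    """Trigger: grow when activations nearly span the layer's current width."""
    return effective_rank(acts) >= ratio * acts.shape[1]

def grow_neuron(W_in, W_out):
    """Add one neuron: incoming weights orthogonal to existing rows of W_in,
    outgoing weights zero so the network's function is initially unchanged."""
    fan_in = W_in.shape[1]
    v = np.random.randn(fan_in)
    # Project out the span of the existing incoming weight vectors (Gram-Schmidt step).
    Q, _ = np.linalg.qr(W_in.T)          # orthonormal basis of the existing rows
    v = v - Q @ (Q.T @ v)
    v /= (np.linalg.norm(v) + 1e-12)
    W_in_new = np.vstack([W_in, v])                                 # (width+1, fan_in)
    W_out_new = np.hstack([W_out, np.zeros((W_out.shape[0], 1))])   # (fan_out, width+1)
    return W_in_new, W_out_new

# Usage: one hidden layer with weights W_in (width, fan_in) and next-layer
# weights W_out (fan_out, width); acts is a batch of that layer's outputs.
rng = np.random.default_rng(0)
W_in, W_out = rng.standard_normal((8, 16)), rng.standard_normal((4, 8))
acts = rng.standard_normal((256, 16)) @ W_in.T
if should_grow(acts):
    W_in, W_out = grow_neuron(W_in, W_out)

Zero-initializing the outgoing weights is one common choice for function-preserving growth; the paper's initializations differ, so treat this only as a sketch of the when/where/how structure.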


