CompNet: Neural networks growing via the compact network morphism

04/27/2018
by Jun Lu, et al.

It is often the case that the performance of a neural network can be improved by adding layers. In practice, dozens of architectures are therefore trained in parallel, which is a wasteful process. We explore CompNet, which morphs a well-trained neural network into a deeper one such that the network function is preserved and the added layer is compact. The paper makes two contributions: (a) the morphed network converges quickly and retains the functionality of the original, so it does not need to be trained from scratch; (b) the size of the added layer is controlled by removing redundant parameters through sparse optimization. This differs from previous network morphism approaches, which tend to add more neurons or channels than actually required and thus leave the model redundant. The method is illustrated on several network architectures and data sets, including MNIST and CIFAR-10.
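The abstract does not spell out CompNet's construction, but its two ingredients are standard enough to sketch. Below is a minimal, hedged illustration in PyTorch of (a) a function-preserving deepening step via an identity-initialized layer (in the spirit of network morphism / Net2DeeperNet) and (b) an L1 penalty that sparsifies the new layer to keep it compact. The names insert_identity_layer, sparse_penalty, and the weight lam are illustrative, not the paper's API, and the paper's exact regularizer may differ.

import torch
import torch.nn as nn

def insert_identity_layer(width):
    """Return a fully connected layer initialized to the identity.

    Inserted after a ReLU, it leaves the network function unchanged:
    the preceding ReLU outputs are non-negative, so ReLU(I @ x) == x.
    (Sketch of the function-preserving morphism step, not CompNet's
    exact construction.)
    """
    layer = nn.Linear(width, width)
    with torch.no_grad():
        layer.weight.copy_(torch.eye(width))
        layer.bias.zero_()
    return layer

def sparse_penalty(layer, lam=1e-4):
    """L1 penalty on the new layer's weights. Added to the task loss,
    it drives redundant entries toward zero so the added layer stays
    compact (one standard sparse-optimization choice)."""
    return lam * layer.weight.abs().sum()

# Usage sketch: deepen a trained network without changing its output.
base = nn.Sequential(nn.Linear(784, 128), nn.ReLU(), nn.Linear(128, 10))
new_layer = insert_identity_layer(128)
deeper = nn.Sequential(base[0], base[1], new_layer, nn.ReLU(), base[2])

x = torch.randn(1, 784)
assert torch.allclose(base(x), deeper(x), atol=1e-6)  # function preserved
# During fine-tuning one would minimize: task_loss + sparse_penalty(new_layer),
# then prune the near-zero rows/columns of new_layer to shrink it.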

Related research

11/22/2015 · Gradual DropIn of Layers to Train Very Deep Neural Networks
We introduce the concept of dynamically growing a neural network during ...

06/21/2022 · Renormalized Sparse Neural Network Pruning
Large neural networks are heavily over-parameterized. This is done becau...

11/18/2015 · Net2Net: Accelerating Learning via Knowledge Transfer
We introduce techniques for rapidly transferring the information stored ...

05/14/2020 · Dynamic Sparse Training: Find Efficient Sparse Network From Scratch With Trainable Masked Layers
We present a novel network pruning algorithm called Dynamic Sparse Train...

06/18/2017 · Sparse Neural Networks Topologies
We propose Sparse Neural Network architectures that are based on random ...

07/02/2021 · Subspace Clustering Based Analysis of Neural Networks
Tools to analyze the latent space of deep neural networks provide a step...

02/17/2022 · When, where, and how to add new neurons to ANNs
Neurogenesis in ANNs is an understudied and difficult problem, even comp...
