Network with Sub-Networks

08/02/2019 · by Ninnart Fuengfusin, et al.

We introduce Network with Sub-Networks, a neural network whose weight layers can be removed on demand during inference to yield sub-neural networks. This method provides selectivity over the number of weight layers. To develop parameters usable in both the base model and the sub-models, the weights and biases are first copied from the sub-models to the base model. Each model is then forwarded separately, and the gradients from both networks are averaged and used to update both networks. In our experiments, the base model achieves test accuracy comparable to regularly trained models while retaining the ability to remove weight layers.
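The shared-parameter training loop described in the abstract (forward each model separately, average the gradients of the shared layers, update both) can be sketched as follows. This is a minimal illustrative toy, not the paper's actual architecture: it assumes a two-layer base model on a single regression example, with a sub-model formed by removing the inner layer `W1` so that the output layer `W2` is shared.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: the base model uses two weight layers (W1, W2);
# the sub-model drops W1 and reuses W2 directly, so W2 is shared.
W1 = 0.1 * rng.normal(size=(4, 4))   # inner layer, base model only
W2 = 0.1 * rng.normal(size=(1, 4))   # output layer, shared by both models

x = rng.normal(size=(4, 1))          # single toy input
t = np.array([[1.0]])                # regression target
lr = 0.1

for _ in range(500):
    # Forward each model separately.
    h = W1 @ x                       # base-model hidden activation
    y_base = W2 @ h                  # base model: two weight layers
    y_sub = W2 @ x                   # sub-model: W1 removed

    # Squared-error gradients for each model.
    e_base, e_sub = y_base - t, y_sub - t
    dW2_base = e_base @ h.T
    dW2_sub = e_sub @ x.T
    dW1 = (W2.T @ e_base) @ x.T      # W1 gets gradient only from the base model

    # Average the gradients of the shared layer, then update both models.
    W2 -= lr * 0.5 * (dW2_base + dW2_sub)
    W1 -= lr * dW1

y_base = (W2 @ (W1 @ x)).item()      # base-model prediction after training
y_sub = (W2 @ x).item()              # sub-model prediction after training
```

Because `W2` is updated with the averaged gradient, both the full base model and the layer-removed sub-model fit the target with the same shared parameters, which is the property the abstract claims for the trained base model.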

