Training highly effective connectivities within neural networks with randomly initialized, fixed weights

06/30/2020
by Cristian Ivan, et al.

We present novel, straightforward methods for training the connection graph of a randomly initialized neural network without training the weights. These methods require no hyperparameters defining cutoff thresholds and therefore remove the need to iteratively search for optimal values of such hyperparameters. They achieve accuracy similar to or higher than training all weights, at a computational cost comparable to that of standard training techniques. Besides switching connections on and off, we introduce a novel way of training a network by flipping the signs of its weights. When we minimize the number of changed connections, changing fewer than 10% of all connections already recovers more than 90% of the accuracy achieved by standard training. We obtain good results even when the weights have constant magnitude or are drawn from highly asymmetric distributions. These results shed light on the over-parameterization of neural networks and on how they may be reduced to their effective size.
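
To make the idea concrete, here is a minimal sketch of how such connectivity training could be implemented with a straight-through estimator, a standard trick from binary-network training. This is an illustrative assumption, not necessarily the authors' exact algorithm; the class names MaskedLinear and SignFlipLinear and all implementation details below are hypothetical. Each connection gets a trainable score, the forward pass binarizes the scores by sign (so no cutoff threshold is needed), and the frozen random weights are gated on/off or sign-flipped accordingly.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class StraightThroughStep(torch.autograd.Function):
    """Heaviside step in the forward pass; gradients pass through
    unchanged in the backward pass."""

    @staticmethod
    def forward(ctx, scores):
        return (scores > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        return grad_output


class MaskedLinear(nn.Module):
    """Linear layer whose random weights stay frozen; training updates
    only the per-connection scores that switch each weight on or off.
    The mask depends only on the sign of each score, so there is no
    cutoff-threshold hyperparameter to tune."""

    def __init__(self, in_features, out_features):
        super().__init__()
        w = torch.randn(out_features, in_features) / in_features ** 0.5
        self.weight = nn.Parameter(w, requires_grad=False)  # fixed weights
        self.scores = nn.Parameter(0.01 * torch.randn_like(w))

    def forward(self, x):
        mask = StraightThroughStep.apply(self.scores)  # 0/1 per connection
        return F.linear(x, self.weight * mask)


class SignFlipLinear(nn.Module):
    """Variant that keeps every connection active but learns which of
    the frozen weights to multiply by -1, again via trainable scores."""

    def __init__(self, in_features, out_features):
        super().__init__()
        w = torch.randn(out_features, in_features) / in_features ** 0.5
        self.weight = nn.Parameter(w, requires_grad=False)
        self.scores = nn.Parameter(0.01 * torch.randn_like(w))

    def forward(self, x):
        flips = 2.0 * StraightThroughStep.apply(self.scores) - 1.0  # +1 or -1
        return F.linear(x, self.weight * flips)
```

In this sketch only the score tensors receive gradients, so an ordinary optimizer over model parameters effectively trains the connectivity (or the signs) while the weight values themselves never change.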

Related research

- What's Hidden in a Randomly Weighted Neural Network? (11/29/2019)
- RicciNets: Curvature-guided Pruning of High-performance Neural Networks Using Ricci Flow (07/08/2020)
- Learning both Weights and Connections for Efficient Neural Networks (06/08/2015)
- Simultaneously Optimizing Weight and Quantizer of Ternary Neural Network using Truncated Gaussian Approximation (10/02/2018)
- Slot Machines: Discovering Winning Combinations of Random Weights in Neural Networks (01/16/2021)
- Deep Regression Ensembles (03/10/2022)
- Domain Aware Markov Logic Networks (07/03/2018)
