Intriguing Properties of Randomly Weighted Networks: Generalizing While Learning Next to Nothing

02/02/2018
by Amir Rosenfeld, et al.

Training deep neural networks results in strong learned representations that show good generalization capabilities. In most cases, training involves iterative modification of all weights inside the network via back-propagation. In Extreme Learning Machines, it has been suggested to set the first layer of a network to fixed random values instead of learning it. In this paper, we propose to take this approach a step further and fix almost all layers of a deep convolutional neural network, allowing only a small portion of the weights to be learned. As our experiments show, fixing even the majority of the network's parameters often results in performance on par with learning all of them. We discuss the implications of this intriguing property of deep neural networks and suggest ways to harness it to create more robust representations.
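As a rough illustration of the core idea (keeping most weights fixed at their random initialization and training only a small subset), the following PyTorch sketch freezes every parameter of a randomly initialized ResNet-18 and then re-enables gradients only for the batch-norm affine terms and the final classifier. The choice of architecture, the particular trainable subset, and the optimizer settings are illustrative assumptions, not the paper's exact experimental protocol.

```python
# Minimal sketch (not the paper's exact setup): keep most weights fixed at
# their random initialization and train only a small fraction of parameters.
# Assumptions: torchvision's ResNet-18 and the trainable subset chosen here
# (batch-norm scale/shift plus the final fully connected layer).
import torch
import torch.nn as nn
from torchvision.models import resnet18

model = resnet18(num_classes=10)  # randomly initialized, no pretrained weights

# Freeze everything first.
for p in model.parameters():
    p.requires_grad = False

# Unfreeze only a small portion: batch-norm parameters and the classifier.
trainable = []
for m in model.modules():
    if isinstance(m, nn.BatchNorm2d):
        for p in m.parameters():
            p.requires_grad = True
            trainable.append(p)
for p in model.fc.parameters():
    p.requires_grad = True
    trainable.append(p)

n_total = sum(p.numel() for p in model.parameters())
n_train = sum(p.numel() for p in trainable)
print(f"training {n_train}/{n_total} parameters ({100.0 * n_train / n_total:.2f}%)")

# Only the unfrozen parameters are handed to the optimizer; the bulk of the
# (random) convolutional weights is never updated.
optimizer = torch.optim.SGD(trainable, lr=0.1, momentum=0.9, weight_decay=5e-4)
```

The sketch would be wrapped in an ordinary training loop; the point is simply that the optimizer only ever sees the small trainable subset, while the vast majority of the network's weights remain at their initial random values.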

Related research

What's Hidden in a Randomly Weighted Neural Network? (11/29/2019)
Training a neural network is synonymous with learning the values of the ...

Non-Iterative Knowledge Fusion in Deep Convolutional Neural Networks (09/25/2018)
Incorporation of new knowledge into neural networks with simultaneous ...

A developmental approach for training deep belief networks (07/12/2022)
Deep belief networks (DBNs) are stochastic neural networks that can extr...

PCA-Initialized Deep Neural Networks Applied To Document Image Analysis (02/01/2017)
In this paper, we present a novel approach for initializing deep neural ...

Adjustable Bounded Rectifiers: Towards Deep Binary Representations (11/19/2015)
Binary representation is desirable for its memory efficiency, computatio...

The Tunnel Effect: Building Data Representations in Deep Neural Networks (05/31/2023)
Deep neural networks are widely known for their remarkable effectiveness...

Beyond Signal Propagation: Is Feature Diversity Necessary in Deep Neural Network Initialization? (07/02/2020)
Deep neural networks are typically initialized with random weights, with...
