Farkas layers: don't shift the data, fix the geometry

10/04/2019
by Aram-Alexandre Pooladian, et al.

Successfully training deep neural networks often requires either batch normalization or appropriate weight initialization, both of which come with their own challenges. We propose an alternative, geometrically motivated method for training. Using elementary results from linear programming, we introduce Farkas layers: a method that ensures at least one neuron is active at a given layer. Focusing on residual networks with ReLU activation, we empirically demonstrate a significant improvement in training capacity, in the absence of batch normalization or initialization methods, across a broad range of network sizes on benchmark datasets.
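The abstract's key property is that at least one neuron in a layer is guaranteed to be active. One simple way to see how such a guarantee can be enforced, sketched below as an illustration rather than the paper's actual construction, is to append one extra pre-activation equal to the negative sum of the others: the augmented vector then sums to zero, so at least one entry is non-negative and at least one ReLU is not dead.

```python
import numpy as np

def farkas_augment(z):
    """Append an extra pre-activation equal to -sum(z).

    The augmented vector sums to zero, so it always contains at least
    one non-negative entry. (Illustrative sketch only; not necessarily
    the construction used in the paper.)
    """
    return np.append(z, -z.sum())

def relu(z):
    return np.maximum(z, 0.0)

rng = np.random.default_rng(0)
z = rng.normal(size=8) - 5.0        # strongly negative: every plain ReLU would be dead
out = relu(farkas_augment(z))
assert out.max() > 0                # the appended neuron fires
```

With plain ReLU on `z` alone, every output here would be zero (a fully dead layer); the appended coordinate guarantees a gradient path through the layer regardless of how the weights are initialized.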


research · 12/07/2021 · Variance-Aware Weight Initialization for Point Convolutional Neural Networks
Appropriate weight initialization has been of key importance to successf...

research · 02/11/2015 · Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift
Training Deep Neural Networks is complicated by the fact that the distri...

research · 05/28/2023 · On the impact of activation and normalization in obtaining isometric embeddings at initialization
In this paper, we explore the structure of the penultimate Gram matrix i...

research · 07/16/2019 · Single-bit-per-weight deep convolutional neural networks without batch-normalization layers for embedded systems
Batch-normalization (BN) layers are thought to be an integrally importan...

research · 01/21/2021 · Characterizing signal propagation to close the performance gap in unnormalized ResNets
Batch Normalization is a key component in almost all state-of-the-art im...

research · 03/31/2021 · Fast Certified Robust Training via Better Initialization and Shorter Warmup
Recently, bound propagation based certified adversarial defenses have be...

research · 04/26/2016 · Scale Normalization
One of the difficulties of training deep neural networks is caused by im...
