Skip Connections Eliminate Singularities

01/31/2017
by A. Emin Orhan, et al.

Skip connections made the training of very deep networks possible and have become an indispensable component in a variety of neural architectures. A completely satisfactory explanation for their success remains elusive. Here, we present a novel explanation for the benefits of skip connections in training very deep networks. The difficulty of training deep networks is partly due to the singularities caused by the non-identifiability of the model. Two such singularities have been identified in previous work: (i) overlap singularities caused by the permutation symmetry of nodes in a given layer and (ii) elimination singularities corresponding to the elimination, i.e., the consistent deactivation, of nodes. These singularities cause degenerate manifolds in the loss landscape that have previously been shown to slow down learning. We argue that skip connections eliminate these singularities by breaking the permutation symmetry of nodes and by reducing the possibility of node elimination. Moreover, for typical initializations, skip connections move the network away from the "ghosts" of these singularities and sculpt the landscape around them to alleviate the learning slow-down. These hypotheses are supported by evidence from simplified models, as well as from experiments with fully-connected deep networks trained on CIFAR-10 and CIFAR-100.
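Both kinds of singularity are easy to see in a one-hidden-layer toy model. The following NumPy sketch is our illustration, not code from the paper; the square layer, the ReLU nonlinearity, and the identity skip h = relu(Wx) + x are simplifying assumptions. In the plain network, permuting hidden units leaves the function unchanged (overlap symmetry), and a unit with zeroed incoming weights makes its outgoing weight unidentifiable (elimination); the identity skip breaks both degeneracies.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4                          # input width == hidden width (square, so an identity skip fits)
x = rng.normal(size=d)         # a random input
W = rng.normal(size=(d, d))    # input-to-hidden weights (rows are hidden units)
v = rng.normal(size=d)         # hidden-to-output weights

def relu(z):
    return np.maximum(z, 0.0)

def plain(W, v, x):
    # Ordinary one-hidden-layer network: y = v . relu(W x)
    return v @ relu(W @ x)

def skip(W, v, x):
    # Same network with an identity skip at the hidden layer: h = relu(W x) + x
    return v @ (relu(W @ x) + x)

# Overlap/permutation symmetry: swap hidden units 0 and 1 by permuting the
# rows of W together with the matching entries of v.
perm = np.arange(d)
perm[[0, 1]] = [1, 0]
print(np.isclose(plain(W, v, x), plain(W[perm], v[perm], x)))  # True: non-identifiable
print(np.isclose(skip(W, v, x), skip(W[perm], v[perm], x)))    # False: symmetry broken

# Elimination: zero out unit 0's incoming weights so it outputs relu(0) = 0.
# In the plain network its outgoing weight v[0] then has no effect on the
# function (an elimination singularity); the skip still routes x[0] through
# the unit, so v[0] remains identifiable.
W_e = W.copy()
W_e[0] = 0.0
v_e = v.copy()
v_e[0] += 1.0                  # perturb the silenced unit's outgoing weight
print(np.isclose(plain(W_e, v, x), plain(W_e, v_e, x)))  # True: v[0] unidentifiable
print(np.isclose(skip(W_e, v, x), skip(W_e, v_e, x)))    # False
```

The skip ties each hidden unit to a specific input coordinate, so swapping two units no longer produces the same function, and a unit whose incoming weights vanish still transmits its input rather than going silent.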


Related research

06/10/2020 · Is the Skip Connection Provable to Reform the Neural Network Loss Landscape?
The residual network is now one of the most effective structures in deep...

05/18/2018 · Norm-Preservation: Why Residual Networks Can Become Extremely Deep?
Augmenting deep neural networks with skip connections, as introduced in ...

01/18/2023 · Tailor: Altering Skip Connections for Resource-Efficient Inference
Deep neural networks use skip connections to improve training convergenc...

02/28/2017 · The Shattered Gradients Problem: If resnets are the answer, then what is the question?
A long-standing obstacle to progress in deep learning is the problem of ...

04/07/2019 · On The Power of Curriculum Learning in Training Deep Networks
Training neural networks is traditionally done by providing a sequence o...

10/11/2016 · An Empirical Exploration of Skip Connections for Sequential Tagging
In this paper, we empirically explore the effects of various kinds of sk...

10/14/2022 · Old can be Gold: Better Gradient Flow can Make Vanilla-GCNs Great Again
Despite the enormous success of Graph Convolutional Networks (GCNs) in m...
