Identity Connections in Residual Nets Improve Noise Stability

05/27/2019
by Shuzhi Yu, et al.

Residual Neural Networks (ResNets) achieve state-of-the-art performance in many computer vision problems. Compared to plain networks without residual connections (PlnNets), ResNets train faster, generalize better, and suffer less from the so-called degradation problem. We introduce simplified (but still nonlinear) versions of ResNets and PlnNets for which these discrepancies still hold, although to a lesser degree. We establish a one-to-one mapping between simplified ResNets and simplified PlnNets, and show that the two are exactly equivalent in expressive power at the same computational complexity. We conjecture that ResNets generalize better because they have better noise stability, and we support this conjecture empirically for both simplified and fully-fledged networks.
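The contrast between the two architectures, and the kind of noise-stability probe behind the conjecture, can be sketched in a few lines of PyTorch. This is a minimal illustration written for this page, not the authors' code: the block structure, the layer sizes (dim, depth), and the perturbation-ratio metric below are all assumptions, and the paper's exact definition of noise stability may differ.

import torch
import torch.nn as nn

torch.manual_seed(0)

class PlainBlock(nn.Module):
    # y = f(x): a nonlinear transformation with no skip path
    def __init__(self, dim):
        super().__init__()
        self.linear = nn.Linear(dim, dim)
        self.act = nn.ReLU()

    def forward(self, x):
        return self.act(self.linear(x))

class ResidualBlock(nn.Module):
    # y = x + f(x): the same transformation plus an identity connection
    def __init__(self, dim):
        super().__init__()
        self.linear = nn.Linear(dim, dim)
        self.act = nn.ReLU()

    def forward(self, x):
        return x + self.act(self.linear(x))

def perturbation_ratio(net, x, sigma=0.1):
    # crude noise-stability probe (an assumption, not the paper's metric):
    # how much an input perturbation of norm ||noise|| moves the output
    with torch.no_grad():
        noise = sigma * torch.randn_like(x)
        return ((net(x + noise) - net(x)).norm() / noise.norm()).item()

dim, depth = 64, 20
x = torch.randn(1, dim)
plain = nn.Sequential(*[PlainBlock(dim) for _ in range(depth)])
resnet = nn.Sequential(*[ResidualBlock(dim) for _ in range(depth)])
print("plain  :", perturbation_ratio(plain, x))
print("resnet :", perturbation_ratio(resnet, x))

Note that the two stacks have identical parameter counts and cost, mirroring the equivalence in expressive power stated above; the printed ratios depend on depth and initialization and are only a rough proxy for the noise-stability analysis in the paper.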

Related research

05/21/2021
Maximum and Leaky Maximum Propagation
In this work, we present an alternative to conventional residual connect...

08/09/2016
Residual Networks of Residual Networks: Multilevel Residual Networks
A residual-networks family with hundreds or even thousands of layers dom...

06/02/2022
Entangled Residual Mappings
Residual mappings have been shown to perform representation learning in ...

09/15/2022
BadRes: Reveal the Backdoors through Residual Connection
Generally, residual connections are indispensable network components in ...

01/09/2017
Visualizing Residual Networks
Residual networks are the current state of the art on ImageNet. Similar ...

03/30/2018
Engineering a Simplified 0-Bit Consistent Weighted Sampling
The Min-Hashing approach to sketching has become an important tool in da...

01/07/2020
Kinetic Theory for Residual Neural Networks
Deep residual neural networks (ResNet) are performing very well for many...
