Visualizing Residual Networks

01/09/2017
by Brian Chu, et al.

Residual networks are the current state of the art on ImageNet. Recent work in the same direction exploits shortcut connections in derivatives of residual networks and in highway networks. These architectures potentially challenge our understanding that CNNs learn layers of local features followed by increasingly global ones. Through qualitative visualization and empirical analysis, we explore the purpose that residual skip connections serve. Our assessments show that, as expected, the residual shortcut connections force layers to refine features. We also provide alternate visualizations confirming that residual networks learn what is already intuitively known about CNNs in general.
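To make the object of study concrete: a residual block computes y = x + F(x), so the trainable body F only has to learn a correction to its input. The following PyTorch snippet is a minimal sketch, not the authors' code; the class name ResidualBlock and the norm-ratio probe at the end are illustrative assumptions, showing one crude way to quantify the "layers refine features" claim.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """A basic residual block: y = ReLU(x + F(x))."""
    def __init__(self, channels):
        super().__init__()
        # F(x): two 3x3 convolutions with batch norm, as in a standard ResNet block.
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
            nn.BatchNorm2d(channels),
        )
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        # The skip connection adds the input back to the block's output,
        # so the block only needs to learn a refinement F(x) of x.
        return self.relu(x + self.body(x))

# Probe: compare the magnitude of the residual F(x) to that of the input x.
block = ResidualBlock(16)
x = torch.randn(1, 16, 32, 32)
residual = block.body(x)
print((residual.norm() / x.norm()).item())
```

If residual blocks mostly refine rather than replace their inputs, one would expect this norm ratio to be small in a trained network; at random initialization, as in this self-contained example, the number carries no meaning and the snippet only demonstrates the mechanics.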


Related research:

- BadRes: Reveal the Backdoors through Residual Connection (09/15/2022)
- Hardware-efficient Residual Networks for FPGAs (02/02/2021)
- Highway and Residual Networks learn Unrolled Iterative Estimation (12/22/2016)
- Residual Connections Encourage Iterative Inference (10/13/2017)
- Solving the Same-Different Task with Convolutional Neural Networks (01/22/2021)
- Maximum and Leaky Maximum Propagation (05/21/2021)
- Identity Connections in Residual Nets Improve Noise Stability (05/27/2019)
