FractalNet: Ultra-Deep Neural Networks without Residuals

05/24/2016
by Gustav Larsson et al.

We introduce a design strategy for neural network macro-architecture based on self-similarity. Repeated application of a simple expansion rule generates deep networks whose structural layouts are precisely truncated fractals. These networks contain interacting subpaths of different lengths, but do not include any pass-through or residual connections; every internal signal is transformed by a filter and nonlinearity before being seen by subsequent layers. In experiments, fractal networks match the excellent performance of standard residual networks on both CIFAR and ImageNet classification tasks, thereby demonstrating that residual representations may not be fundamental to the success of extremely deep convolutional neural networks. Rather, the key may be the ability to transition, during training, from effectively shallow to deep. We note similarities with student-teacher behavior and develop drop-path, a natural extension of dropout, to regularize co-adaptation of subpaths in fractal architectures. Such regularization allows extraction of high-performance fixed-depth subnetworks. Additionally, fractal networks exhibit an anytime property: shallow subnetworks provide a quick answer, while deeper subnetworks, with higher latency, provide a more accurate answer.
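The expansion rule and the drop-path join can be made concrete with a short sketch. The following is a minimal illustration in PyTorch, not the authors' reference implementation: the 3x3 conv-BN-ReLU unit, the element-wise mean as the join operation, and the fixed local drop-path rate are assumptions chosen for brevity.

```python
import torch
import torch.nn as nn


def conv_unit(channels):
    # Base case f_1: every signal passes through a conv and a nonlinearity;
    # there is no identity/pass-through branch anywhere in the block.
    return nn.Sequential(
        nn.Conv2d(channels, channels, kernel_size=3, padding=1),
        nn.BatchNorm2d(channels),
        nn.ReLU(inplace=True),
    )


class FractalBlock(nn.Module):
    """Expansion rule: f_{C+1}(z) = join(conv(z), (f_C o f_C)(z)).

    A block with C columns contains sub-paths whose depths range
    from 1 up to 2^(C-1) conv units.
    """

    def __init__(self, channels, columns, drop_prob=0.15):
        super().__init__()
        self.drop_prob = drop_prob
        self.shallow = conv_unit(channels)          # short sub-path: one conv
        self.deep = None
        if columns > 1:
            # long sub-path: two copies of the (C-1)-column block, composed
            self.deep = nn.Sequential(
                FractalBlock(channels, columns - 1, drop_prob),
                FractalBlock(channels, columns - 1, drop_prob),
            )

    def join(self, paths):
        # Local drop-path: during training, discard each incoming sub-path
        # with probability drop_prob, keeping at least one alive, so that
        # sub-paths cannot co-adapt. At test time, average all of them.
        if self.training and self.drop_prob > 0:
            kept = [p for p in paths if torch.rand(1).item() >= self.drop_prob]
            if not kept:
                kept = [paths[torch.randint(len(paths), (1,)).item()]]
            paths = kept
        return torch.stack(paths, dim=0).mean(dim=0)

    def forward(self, z):
        if self.deep is None:
            return self.shallow(z)
        return self.join([self.shallow(z), self.deep(z)])


# Example: a 4-column fractal block; its deepest sub-path is 2^3 = 8 convs.
block = FractalBlock(channels=64, columns=4)
x = torch.randn(2, 64, 32, 32)
y = block(x)   # same spatial size and channel count as the input
```

The recursive structure mirrors the self-similarity described in the abstract: each call to the block exposes one short sub-path and one twice-as-deep sub-path, and the join is the only point where they interact.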


