Multi-Residual Networks: Improving the Speed and Accuracy of Residual Networks

09/19/2016
by Masoud Abdi, et al.

In this article, we take one step toward understanding the learning behavior of deep residual networks and toward supporting the observation that deep residual networks behave like ensembles. We propose a new convolutional neural network architecture that builds on the success of residual networks by explicitly exploiting the interpretation of very deep networks as an ensemble. The proposed multi-residual network increases the number of residual functions in the residual blocks, generating models that are wider, rather than deeper, which significantly improves accuracy. We show that our model achieves an error rate of 3.73% on CIFAR-10, outperforming almost all of the existing models. We also demonstrate that our model outperforms very deep residual networks by 0.22% (top-1 error) on the full ImageNet 2012 classification dataset. Additionally, inspired by the parallel structure of multi-residual networks, a model parallelism technique has been investigated. The model parallelism method distributes the computation of the residual blocks among the processors, yielding up to 15% computational complexity improvement.
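The block-level idea can be sketched in a few lines. Below is a minimal PyTorch sketch of a multi-residual block, assuming the usual pre-activation (BN-ReLU-Conv) residual function; the class name MultiResidualBlock and the num_functions argument are illustrative and not taken from the authors' code.

import torch
import torch.nn as nn

class MultiResidualBlock(nn.Module):
    """A residual block with several parallel residual functions.

    Output: y = x + f_1(x) + ... + f_k(x), i.e. the block is made wider
    by adding residual functions instead of stacking more blocks.
    (Sketch only; not the authors' reference implementation.)
    """
    def __init__(self, channels, num_functions=2):
        super().__init__()

        def residual_function():
            # Standard pre-activation residual function: BN-ReLU-Conv, twice.
            return nn.Sequential(
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
                nn.Conv2d(channels, channels, kernel_size=3, padding=1, bias=False),
            )

        self.functions = nn.ModuleList(
            [residual_function() for _ in range(num_functions)]
        )

    def forward(self, x):
        # Identity shortcut plus the sum of all parallel residual functions.
        out = x
        for f in self.functions:
            out = out + f(x)
        return out

# Example: a block with 3 residual functions on a 16-channel feature map.
block = MultiResidualBlock(channels=16, num_functions=3)
y = block(torch.randn(1, 16, 32, 32))  # y.shape == (1, 16, 32, 32)

Because the residual functions within a block are independent of one another, they can be evaluated concurrently, which is what makes the model parallelism scheme described above natural for this architecture.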

Related research

11/30/2016 - Wider or Deeper: Revisiting the ResNet Model for Visual Recognition
The trend towards increasingly deep neural networks has been driven by a...

05/20/2016 - Residual Networks Behave Like Ensembles of Relatively Shallow Networks
In this work we propose a novel interpretation of residual networks show...

02/11/2019 - On Residual Networks Learning a Perturbation from Identity
The purpose of this work is to test and study the hypothesis that residu...

06/02/2017 - Dynamic Steerable Blocks in Deep Residual Networks
Filters in convolutional networks are typically parameterized in a pixel...

11/08/2016 - The Loss Surface of Residual Networks: Ensembles and the Role of Batch Normalization
Deep Residual Networks present a premium in performance in comparison to...

11/22/2017 - BlockDrop: Dynamic Inference Paths in Residual Networks
Very deep convolutional neural networks offer excellent recognition resu...

05/04/2023 - Stimulative Training++: Go Beyond The Performance Limits of Residual Networks
Residual networks have shown great success and become indispensable in r...
