Maximum and Leaky Maximum Propagation

05/21/2021
by Wolfgang Fuhl, et al.

In this work, we present an alternative to conventional residual connections that is inspired by maxout nets. Instead of the addition in a residual connection, our approach propagates only the maximum value or, in the leaky formulation, a weighted share of both. In our evaluation on several public data sets, we show that the presented approaches are comparable to residual connections and have other interesting properties, such as better generalization with constant batch normalization, faster learning, and the ability to generalize without additional activation functions. In addition, the proposed approaches work very well in ensembles with residual networks.
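
The merge operation is simple to implement. Below is a minimal PyTorch sketch of a residual-style block with maximum propagation; the element-wise maximum/minimum formulation and the leak mixing weight are one plausible reading of the abstract, not code from the paper.

    import torch
    import torch.nn as nn

    class MaxPropBlock(nn.Module):
        # Residual-style block: instead of returning x + f(x), it
        # propagates the element-wise maximum of the skip input and the
        # branch output. With leak > 0 (the "leaky" variant), a fraction
        # of the smaller value is mixed back in (assumed interpretation
        # of "propagates a percentage of both").
        def __init__(self, channels, leak=0.0):
            super().__init__()
            self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
            self.bn1 = nn.BatchNorm2d(channels)
            self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
            self.bn2 = nn.BatchNorm2d(channels)
            self.leak = leak  # 0.0 -> pure maximum propagation

        def forward(self, x):
            y = self.bn2(self.conv2(torch.relu(self.bn1(self.conv1(x)))))
            # A conventional residual connection would return x + y.
            return torch.maximum(x, y) + self.leak * torch.minimum(x, y)

For example, MaxPropBlock(64, leak=0.1)(torch.randn(1, 64, 32, 32)) behaves like a standard ResNet block except for the merge step.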


Related research

Recurrent Rational Networks (02/18/2021)
Latest insights from biology show that intelligence does not only emerge...

Identity Connections in Residual Nets Improve Noise Stability (05/27/2019)
Residual Neural Networks (ResNets) achieve state-of-the-art performance ...

Dynamical Isometry for Residual Networks (10/05/2022)
The training success, training speed and generalization ability of neura...

The Loss Surface of Residual Networks: Ensembles and the Role of Batch Normalization (11/08/2016)
Deep Residual Networks present a premium in performance in comparison to...

Visualizing Residual Networks (01/09/2017)
Residual networks are the current state of the art on ImageNet. Similar ...

Fading memory as inductive bias in residual recurrent networks (07/27/2023)
Residual connections have been proposed as architecture-based inductive ...

YOLOv4: Optimal Speed and Accuracy of Object Detection (04/23/2020)
There are a huge number of features which are said to improve Convolutio...
