Deep Isometric Learning for Visual Recognition

06/30/2020
by   Haozhi Qi, et al.

Initialization, normalization, and skip connections are believed to be three indispensable techniques for training very deep convolutional neural networks and obtaining state-of-the-art performance. This paper shows that deep vanilla ConvNets, with neither normalization nor skip connections, can also be trained to achieve surprisingly good performance on standard image recognition benchmarks. This is achieved by enforcing the convolution kernels to be near isometric during initialization and training, and by using a variant of ReLU that is shifted towards being isometric. Further experiments show that, when combined with skip connections, such near isometric networks can achieve performance on par with (on ImageNet) or better than (on COCO) the standard ResNet, even without any normalization. Our code is available at https://github.com/HaozhiQi/ISONet.
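The two ingredients named in the abstract can be sketched in a few lines. Below is a hedged, minimal illustration (not the authors' implementation; see the linked repo for that): a Dirac-delta kernel initialization, which makes a convolution act as an identity map across channels and is therefore exactly isometric at initialization, and a shifted ReLU `srelu(x) = max(x, b)`, which approaches the identity (an isometry) as the learnable shift `b` decreases. The function names and the omission of the training-time orthogonality penalty are simplifications of mine.

```python
import numpy as np

def delta_init(out_ch, in_ch, k):
    """Dirac-delta initialization sketch: place an identity matrix across
    the channel dimensions at the spatial center of the kernel, zeros
    elsewhere, so the convolution starts as an (iso)metric identity map."""
    w = np.zeros((out_ch, in_ch, k, k))
    c = k // 2                      # spatial center of the k x k kernel
    n = min(out_ch, in_ch)
    w[np.arange(n), np.arange(n), c, c] = 1.0
    return w

def srelu(x, b=-1.0):
    """Shifted ReLU: max(x, b). For b = 0 this is the ordinary ReLU;
    as b decreases it interpolates towards the identity map, i.e. towards
    being isometric. In the paper the shift is a learnable scalar."""
    return np.maximum(x, b)
```

For example, `delta_init(64, 64, 3)` yields a kernel whose center 64x64 slice is the identity and whose other entries are zero, so the corresponding convolution initially passes its input through unchanged.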

