
Stable ResNet

10/24/2020
by Soufiane Hayou, et al.

Deep ResNet architectures have achieved state-of-the-art performance on many tasks. While they solve the problem of vanishing gradients, they may suffer from exploding gradients as the depth becomes large (Yang et al. 2017). Moreover, recent results have shown that ResNet may lose expressivity as the depth goes to infinity (Yang et al. 2017, Hayou et al. 2019). To resolve these issues, we introduce a new class of ResNet architectures, called Stable ResNet, that stabilize the gradient while ensuring expressivity in the infinite-depth limit.
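A minimal sketch of the core idea, not the authors' code: in a toy pre-activation residual network x_{l+1} = x_l + s * W_l relu(x_l), the activation norm of a standard ResNet (s = 1) grows exponentially with depth, while scaling each residual branch by s = 1/sqrt(L) (a uniform scaling of the kind the abstract alludes to; the layer widths, initialization, and depth below are illustrative assumptions) keeps it bounded:

```python
import numpy as np

def resnet_forward(x, depth, width, scale, seed=0):
    """Toy pre-activation ResNet: x_{l+1} = x_l + scale * W_l @ relu(x_l).

    scale = 1.0 mimics a standard ResNet; scale = 1/sqrt(depth) is the
    uniform branch scaling used here to illustrate gradient/activation
    stabilization (illustrative choice, not the paper's exact setup).
    """
    rng = np.random.default_rng(seed)
    for _ in range(depth):
        # He-style initialization: variance 2/width
        W = rng.normal(0.0, np.sqrt(2.0 / width), size=(width, width))
        x = x + scale * (W @ np.maximum(x, 0.0))
    return x

width, depth = 64, 200
x0 = np.random.default_rng(1).normal(size=width)

standard = resnet_forward(x0, depth, width, scale=1.0)
stable = resnet_forward(x0, depth, width, scale=1.0 / np.sqrt(depth))

# The unscaled network's activations blow up exponentially in depth,
# while the 1/sqrt(L)-scaled version stays of order one.
print(np.linalg.norm(standard))  # very large
print(np.linalg.norm(stable))    # moderate
```

The same scaling argument applies to the backward pass: shrinking each residual branch by 1/sqrt(L) turns a product of factors (1 + O(1)) into (1 + O(1/L))^L, which stays bounded as L grows.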

