Residual Networks Behave Like Boosting Algorithms

09/25/2019
by Chapman Siu, et al.

We show that Residual Networks (ResNets) are equivalent to boosting over feature representations, without any modification to the underlying ResNet training algorithm. We prove a regret bound based on Online Gradient Boosting theory, which suggests that ResNets can achieve Online Gradient Boosting regret bounds through architectural changes: adding a shrinkage parameter to the identity skip-connections and using residual modules with max-norm bounds. Through this relation between ResNets and Online Boosting, novel feature-representation boosting algorithms can be constructed by altering the residual modules. We demonstrate this by proposing decision tree residual modules to construct a new boosted decision tree algorithm, deriving generalization error bounds for both approaches, and relaxing constraints within the BoostResNet algorithm so that it can be trained in an out-of-core manner. We evaluate convolutional ResNets with and without the shrinkage modification to demonstrate its efficacy, and show that our online boosted decision tree algorithm is comparable to state-of-the-art offline boosted decision tree algorithms without the drawbacks of offline approaches.
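As a concrete illustration of the architectural change described above, here is a minimal sketch of a residual block with a shrinkage parameter on the identity skip-connection and a max-norm bound on the residual module's weights. It is written in PyTorch purely for illustration; the class and parameter names (ShrinkageResidualBlock, gamma, max_norm) and the default values are hypothetical assumptions, not the paper's notation or recommended settings.

```python
import torch
import torch.nn as nn

class ShrinkageResidualBlock(nn.Module):
    """One residual block: output = gamma * x + f(x), where gamma is a
    shrinkage parameter on the identity path and f is the residual module."""

    def __init__(self, dim: int, gamma: float = 0.9, max_norm: float = 1.0):
        super().__init__()
        self.gamma = gamma        # shrinkage on the identity skip-connection
        self.max_norm = max_norm  # bound on the residual module's weight norms
        self.f = nn.Sequential(   # a simple two-layer residual module
            nn.Linear(dim, dim),
            nn.ReLU(),
            nn.Linear(dim, dim),
        )

    def clip_weights(self) -> None:
        # Project each weight tensor back inside the max-norm ball;
        # intended to be called after every optimizer step.
        with torch.no_grad():
            for p in self.f.parameters():
                norm = p.norm()
                if norm > self.max_norm:
                    p.mul_(self.max_norm / norm)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.gamma * x + self.f(x)

# Usage: shrunken skip-connection plus max-norm-bounded residual module.
block = ShrinkageResidualBlock(dim=64)
y = block(torch.randn(8, 64))
block.clip_weights()
```

Setting gamma to 1 and omitting the weight clipping recovers a standard residual block, which is the sense in which the modification is purely architectural.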


Related research

04/25/2019
TreeGrad: Transferring Tree Ensembles to Neural Networks
Gradient Boosting Decision Trees (GBDT) are popular machine learning algo...

02/25/2018
Functional Gradient Boosting based on Residual Network Perception
Residual Networks (ResNets) have become state-of-the-art models in deep ...

05/24/2018
Multi-Level Deep Cascade Trees for Conversion Rate Prediction
Developing effective and efficient recommendation methods is very challe...

05/18/2018
Norm-Preservation: Why Residual Networks Can Become Extremely Deep?
Augmenting deep neural networks with skip connections, as introduced in ...

06/14/2012
On Local Regret
Online learning aims to perform nearly as well as the best hypothesis in...

02/14/2020
Skip Connections Matter: On the Transferability of Adversarial Examples Generated with ResNets
Skip connections are an essential component of current state-of-the-art ...

02/16/2019
RES-SE-NET: Boosting Performance of Resnets by Enhancing Bridge-connections
One of the ways to train deep neural networks effectively is to use resi...
