The Hybrid Bootstrap: A Drop-in Replacement for Dropout

01/22/2018
by Robert Kosar, et al.

Regularization is an important component of predictive model building. The hybrid bootstrap is a regularization technique that functions similarly to dropout, except that features are resampled from other training points rather than replaced with zeros. We show that the hybrid bootstrap outperforms dropout. We also present a sampling-based technique to simplify hyperparameter choice. Next, we provide an alternative sampling technique for convolutional neural networks. Finally, we demonstrate the efficacy of the hybrid bootstrap on non-image tasks using tree-based models.
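To make the mechanism concrete, here is a minimal NumPy sketch of the idea, not the authors' implementation: as in dropout, each feature is independently selected with some probability, but selected values are copied from a randomly chosen donor training point rather than set to zero. The function name, the `rate` parameter, and the per-feature donor choice are illustrative assumptions.

```python
import numpy as np

def hybrid_bootstrap(X, rate=0.5, rng=None):
    """One hybrid-bootstrap perturbation of a batch X of shape (n, p)."""
    rng = np.random.default_rng(rng)
    n, p = X.shape
    # Bernoulli mask, as in dropout: True marks features to perturb.
    mask = rng.random((n, p)) < rate
    # For each (row, feature) position, pick a random donor training row.
    donors = rng.integers(n, size=(n, p))
    cols = np.broadcast_to(np.arange(p), (n, p))
    X_out = X.copy()
    # Unlike dropout, selected features take the donor row's value for
    # the same feature instead of being zeroed.
    X_out[mask] = X[donors[mask], cols[mask]]
    return X_out

# Example: perturb a batch of 4 points with 3 features each.
X = np.arange(12, dtype=float).reshape(4, 3)
print(hybrid_bootstrap(X, rate=0.5, rng=0))
```

In a neural network, a perturbation like this would play the role of a dropout layer, applied to inputs or hidden activations within a mini-batch at training time only.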


