Advanced Dropout: A Model-free Methodology for Bayesian Dropout Optimization

10/11/2020
by Jiyang Xie, et al.

Due to lack of data, overfitting ubiquitously exists in real-world applications of deep neural networks (DNNs). In this paper, we propose advanced dropout, a model-free methodology, to mitigate overfitting and improve the performance of DNNs. The advanced dropout technique applies a model-free and easily implemented distribution with a parametric prior, and adaptively adjusts the dropout rate. Specifically, the distribution parameters are optimized by stochastic gradient variational Bayes (SGVB) inference in order to carry out an end-to-end training of DNNs. We evaluate the effectiveness of the advanced dropout against nine dropout techniques on five widely used datasets in computer vision. The advanced dropout outperforms all the referred techniques by 0.83% in classification accuracy, and we analyze the effectiveness of each component. Meanwhile, the convergence of the dropout rate and the ability to prevent overfitting are discussed in terms of classification performance. Moreover, we extend the application of the advanced dropout to uncertainty inference and network pruning, and we find that the advanced dropout is superior to the corresponding referred methods. The advanced dropout improves classification accuracies by 4% for uncertainty inference and by 0.2% for network pruning with reduced parameters, respectively.
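
The abstract does not spell out the exact form of the distribution, so the following PyTorch sketch only illustrates the general recipe it describes: a dropout rate governed by learnable variational parameters, sampled with the reparameterization trick so that SGVB can train it end-to-end alongside the network, plus a KL term against a parametric prior. The class name AdaptiveDropout, the Gaussian-posterior-with-sigmoid-link parameterization, the relaxed (concrete) Bernoulli mask, and all hyperparameter values are illustrative assumptions, not the paper's actual formulation.

```python
# Minimal sketch of an SGVB-trained dropout layer. Assumptions: Gaussian
# variational posterior over a latent z with dropout rate p = sigmoid(z),
# a Gaussian prior on z, and a concrete relaxation of the Bernoulli mask.
import math
import torch
import torch.nn as nn
from torch.distributions import RelaxedBernoulli

class AdaptiveDropout(nn.Module):
    """Dropout whose rate is a learnable variational parameter (illustrative)."""

    def __init__(self, prior_mu=0.0, prior_sigma=1.0, temperature=0.1):
        super().__init__()
        # Variational parameters of z; the dropout rate is sigmoid(z).
        self.mu = nn.Parameter(torch.tensor(0.0))
        self.log_sigma = nn.Parameter(torch.tensor(-3.0))
        self.prior_mu, self.prior_sigma = prior_mu, prior_sigma
        self.temperature = temperature

    def forward(self, x):
        if not self.training:
            # Inverted-dropout scaling during training makes test time an identity.
            return x
        # Reparameterization trick: z = mu + sigma * eps keeps z differentiable
        # w.r.t. mu and log_sigma, which is what enables SGVB training.
        z = self.mu + self.log_sigma.exp() * torch.randn(())
        keep = 1.0 - torch.sigmoid(z)  # keep probability in (0, 1)
        # Relaxed Bernoulli mask so gradients also flow through the sampling step.
        mask = RelaxedBernoulli(torch.tensor(self.temperature), probs=keep).rsample(x.shape)
        return x * mask / keep.clamp_min(1e-6)

    def kl(self):
        # Closed-form KL( N(mu, sigma^2) || N(prior_mu, prior_sigma^2) ); this is
        # the prior-matching regularizer of the variational objective.
        var, pvar = (2 * self.log_sigma).exp(), self.prior_sigma ** 2
        return 0.5 * (var / pvar + (self.mu - self.prior_mu) ** 2 / pvar
                      - 1.0 + math.log(pvar) - 2 * self.log_sigma)
```

In a training loop one would add the KL terms of all such layers to the task loss, e.g. `loss = criterion(model(x), y) + beta * sum(m.kl() for m in dropout_layers)`, where `beta` is a hypothetical weighting hyperparameter; the paper instead derives its objective directly from SGVB inference.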

Related research

12/22/2014 | A Bayesian encourages dropout
Dropout is one of the key techniques to prevent the learning from overfi...

10/05/2022 | Revisiting Structured Dropout
Large neural networks are often overparameterised and prone to overfitti...

05/23/2019 | Multi-Sample Dropout for Accelerated Training and Better Generalization
Dropout is a simple but efficient regularization technique for achieving...

03/09/2021 | PGD-based advanced nonlinear multiparametric regressions for constructing metamodels at the scarce-data limit
Regressions created from experimental or simulated data enable the const...

11/05/2016 | Robustly representing inferential uncertainty in deep neural networks through sampling
As deep neural networks (DNNs) are applied to increasingly challenging p...

04/05/2022 | A Survey on Dropout Methods and Experimental Verification in Recommendation
Overfitting is a common problem in machine learning, which means the mod...

06/15/2021 | CODA: Constructivism Learning for Instance-Dependent Dropout Architecture Construction
Dropout is attracting intensive research interest in deep learning as an...
