CODA: Constructivism Learning for Instance-Dependent Dropout Architecture Construction

06/15/2021
by   Xiaoli Li, et al.

Dropout is attracting intensive research interest in deep learning as an efficient approach to preventing overfitting. Recently, incorporating structural information when deciding which units to drop out has produced promising results compared to methods that ignore structural information. However, a major limitation of the existing work is that it fails to differentiate among instances when constructing the dropout architecture, which can be a significant deficiency for many applications. To address this issue, we propose Constructivism learning for instance-dependent Dropout Architecture (CODA), which is inspired by a philosophical theory, constructivism learning. Specifically, based on this theory we design an improved dropout technique, Uniform Process Mixture Models, using a Bayesian nonparametric method, the Uniform process. We evaluate the proposed method on 5 real-world datasets and compare its performance with other state-of-the-art dropout techniques. The experimental results demonstrate the effectiveness of CODA.
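To make the distinction concrete, the sketch below contrasts standard dropout, which drops every unit with the same fixed probability regardless of the input, with a toy instance-dependent variant in which each unit's keep probability is a function of the instance itself. This is only an illustration of the general idea of instance-dependent dropout; it is not the paper's Uniform Process Mixture Model, and `gate_weights` is a hypothetical per-unit parameter introduced here for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def standard_dropout(x, p=0.5, training=True):
    """Standard (inverted) dropout: every unit is dropped with the same
    probability p, independently of the input instance."""
    if not training:
        return x
    mask = rng.random(x.shape) >= p
    # Rescale surviving units so the expected activation is unchanged.
    return x * mask / (1.0 - p)

def instance_dependent_dropout(x, gate_weights, training=True):
    """Toy instance-dependent dropout (NOT the paper's UPMM): each unit's
    keep probability depends on the instance x via a sigmoid gate.
    `gate_weights` is a hypothetical per-unit parameter vector."""
    if not training:
        return x
    keep_prob = 1.0 / (1.0 + np.exp(-x * gate_weights))  # in (0, 1), varies with x
    mask = rng.random(x.shape) < keep_prob
    # Per-unit inverted-dropout rescaling.
    return x * mask / keep_prob

x = np.array([0.5, -1.2, 2.0, 0.1])
w = np.ones_like(x)
print(standard_dropout(x, p=0.5))
print(instance_dependent_dropout(x, w))
```

At evaluation time both functions return the input unchanged; during training, the instance-dependent variant tends to keep large positive activations and drop strongly negative ones, so different inputs effectively see different thinned subnetworks.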


Related research:

- 12/22/2014 · A Bayesian encourages dropout
  Dropout is one of the key techniques to prevent the learning from overfi...

- 03/06/2021 · Contextual Dropout: An Efficient Sample-Dependent Dropout Module
  Dropout has been demonstrated as a simple and effective module to not on...

- 07/26/2017 · Reduction of Overfitting in Diabetes Prediction Using Deep Learning Neural Network
  Augmented accuracy in prediction of diabetes will open up new frontiers ...

- 03/09/2023 · Aux-Drop: Handling Haphazard Inputs in Online Learning Using Auxiliary Dropouts
  Many real-world applications based on online learning produce streaming ...

- 12/15/2016 · Improving Neural Network Generalization by Combining Parallel Circuits with Dropout
  In an attempt to solve the lengthy training times of neural networks, we...

- 10/11/2020 · Advanced Dropout: A Model-free Methodology for Bayesian Dropout Optimization
  Due to lack of data, overfitting ubiquitously exists in real-world appli...

- 04/05/2022 · A Survey on Dropout Methods and Experimental Verification in Recommendation
  Overfitting is a common problem in machine learning, which means the mod...
