Aux-Drop: Handling Haphazard Inputs in Online Learning Using Auxiliary Dropouts

03/09/2023
by Rohit Agarwal, et al.

Many real-world applications based on online learning produce streaming data that is haphazard in nature, i.e., it contains missing features, features that become obsolete over time, new features that appear at later points in time, and no clarity on the total number of input features. These challenges make it hard to build a learnable system for such applications, and almost no work in deep learning addresses this issue. In this paper, we present Aux-Drop, an auxiliary dropout regularization strategy for online learning that handles haphazard input features in an effective manner. Aux-Drop adapts the conventional dropout regularization scheme to the haphazard input feature space, ensuring that the final output is minimally impacted by the chaotic appearance of such features. It helps prevent co-adaptation, especially between the auxiliary and base features, and reduces the strong dependence of the output on any individual auxiliary input of the model. This enables better learning in scenarios where certain features disappear over time or new features must be modeled. The efficacy of Aux-Drop is demonstrated through extensive numerical experiments on SOTA benchmark datasets, including Italy Power Demand, HIGGS, SUSY and multiple UCI datasets.
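
The abstract does not give implementation details, but the core mechanism it describes can be illustrated with a short sketch. Below is a minimal, hypothetical PyTorch version of an availability-aware dropout over an auxiliary layer: units tied to auxiliary features that did not arrive at the current time step are always dropped, and additional units are dropped at random until a target fraction p of the layer is zeroed. The function name aux_dropout, the one-unit-per-feature layout, and the exact drop-count policy are assumptions made for illustration, not the authors' reference implementation.

```python
import torch

def aux_dropout(aux_activations, available_mask, p=0.5, training=True):
    """Availability-aware dropout sketch (illustrative, not the paper's code).

    aux_activations: (batch, n_aux) activations, one unit per auxiliary feature
    available_mask:  (n_aux,) bool tensor, True where the feature arrived
    p:               target fraction of auxiliary units to drop
    """
    if not training:
        return aux_activations

    n_aux = aux_activations.shape[-1]
    n_drop = int(p * n_aux)

    # Forced drops: units whose auxiliary feature is missing at this step.
    drop = ~available_mask
    n_forced = int(drop.sum())

    # If the quota allows, drop extra units at random among the available ones.
    # (If more features are missing than the quota, all of them stay dropped.)
    if n_drop > n_forced:
        avail_idx = torch.nonzero(available_mask).flatten()
        extra = avail_idx[torch.randperm(len(avail_idx))[: n_drop - n_forced]]
        drop[extra] = True

    # Zero the dropped units and rescale the kept ones (inverted-dropout style)
    # so the expected activation magnitude is preserved.
    keep = (~drop).float()
    n_kept = keep.sum().clamp(min=1.0)
    return aux_activations * keep * (n_aux / n_kept)
```

Because the forced drops coincide with missing inputs, the network cannot distinguish a randomly dropped unit from an absent feature during training, which discourages strong reliance on any single auxiliary input, the co-adaptation effect the abstract describes.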

Related research

08/26/2020 · Auxiliary Network: Scalable and agile online learning for dynamic system with inconsistently available inputs
Streaming classification methods assume the number of input features is ...

11/09/2017 · Analysis of Dropout in Online Learning
Deep learning is the state-of-the-art in fields such as visual object re...

06/15/2021 · CODA: Constructivism Learning for Instance-Dependent Dropout Architecture Construction
Dropout is attracting intensive research interest in deep learning as an...

06/28/2021 · R-Drop: Regularized Dropout for Neural Networks
Dropout is a powerful and widely used technique to regularize the traini...

12/29/2022 · Macro-block dropout for improved regularization in training end-to-end speech recognition models
This paper proposes a new regularization algorithm referred to as macro-...

02/05/2020 · Dropout Prediction over Weeks in MOOCs via Interpretable Multi-Layer Representation Learning
Massive Open Online Courses (MOOCs) have become popular platforms for on...

05/22/2023 · Regularization Through Simultaneous Learning: A Case Study for Hop Classification
Overfitting remains a prevalent challenge in deep neural networks, leadi...
