
CoopSubNet: Cooperating Subnetwork for Data-Driven Regularization of Deep Networks under Limited Training Budgets

06/13/2019 · by Riddhish Bhalodia, et al. · The University of Utah

Deep networks are an integral part of the current machine learning paradigm. Their inherent ability to learn complex functional mappings between data and various target variables, while discovering hidden, task-driven features, makes them a powerful technology in a wide variety of applications. Nonetheless, the success of these networks typically relies on the availability of sufficient training data to optimize a large number of free parameters while avoiding overfitting, especially for networks with large capacity. In scenarios with limited training budgets, e.g., supervised tasks with limited labeled samples, several generic and/or task-specific regularization techniques, including data augmentation, have been applied to improve the generalization of deep networks. Typically, such regularizations are introduced independently of the data or training scenario, and must therefore be tuned, tested, and modified to meet the needs of a particular network. In this paper, we propose a novel regularization framework that is driven by the population-level statistics of the feature space to be learned. The regularization is in the form of a cooperating subnetwork, which is an auto-encoder architecture attached to the feature space and trained in conjunction with the primary network. We introduce the architecture and training methodology and demonstrate the effectiveness of the proposed cooperative network-based regularization in a variety of tasks and architectures from the literature. Our code is freely available at <https://github.com/riddhishb/CoopSubNet>.
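To make the idea concrete, here is a minimal PyTorch sketch of the cooperative regularization the abstract describes: a low-capacity auto-encoder (the "cooperating subnetwork") is attached to the primary network's feature space, and the two are trained jointly, with the reconstruction error acting as a data-driven regularizer. The module names, layer sizes, bottleneck width, and the fixed loss weight `lam` are illustrative assumptions, not the authors' released implementation (see the linked repository for that).

```python
# Sketch of a cooperating-subnetwork regularizer (illustrative, not the
# authors' code): an auto-encoder reconstructs the primary network's
# features, and its reconstruction loss is added to the task loss.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PrimaryNet(nn.Module):
    """Toy primary network that exposes its intermediate feature space."""
    def __init__(self, in_dim=784, feat_dim=128, n_classes=10):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 256), nn.ReLU(),
                                     nn.Linear(256, feat_dim), nn.ReLU())
        self.head = nn.Linear(feat_dim, n_classes)

    def forward(self, x):
        feats = self.encoder(x)   # features the subnetwork will regularize
        return self.head(feats), feats

class CoopSubNet(nn.Module):
    """Low-capacity auto-encoder attached to the primary feature space."""
    def __init__(self, feat_dim=128, bottleneck_dim=16):
        super().__init__()
        self.enc = nn.Linear(feat_dim, bottleneck_dim)
        self.dec = nn.Linear(bottleneck_dim, feat_dim)

    def forward(self, feats):
        return self.dec(torch.relu(self.enc(feats)))

primary, coop = PrimaryNet(), CoopSubNet()
opt = torch.optim.Adam(list(primary.parameters()) + list(coop.parameters()),
                       lr=1e-3)
lam = 0.1  # assumed weight balancing task loss vs. reconstruction loss

def train_step(x, y):
    logits, feats = primary(x)
    recon = coop(feats)
    # Joint objective: the auto-encoder's reconstruction error pushes the
    # features toward a low-dimensional, population-level structure.
    loss = F.cross_entropy(logits, y) + lam * F.mse_loss(recon, feats)
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```

One design point this sketch highlights: because both networks share one optimizer and the reconstruction term flows gradients back into the primary encoder, the subnetwork "cooperates" rather than being a post-hoc constraint, which is what distinguishes this from tuning a generic, data-independent regularizer.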

Related research

05/28/2019 · A Hessian Based Complexity Measure for Deep Networks
09/26/2019 · Implicit Semantic Data Augmentation for Deep Networks
01/09/2022 · Invariance encoding in sliced-Wasserstein space for image classification with limited training data
12/20/2014 · Discovering Hidden Factors of Variation in Deep Networks
08/29/2018 · DADA: Deep Adversarial Data Augmentation for Extremely Low Data Regime Classification
11/29/2022 · LUMix: Improving Mixup by Better Modelling Label Uncertainty
11/17/2022 · EfficientTrain: Exploring Generalized Curriculum Learning for Training Visual Backbones