FuCiTNet: Improving the generalization of deep learning networks by the fusion of learned class-inherent transformations

05/17/2020
by Manuel Rey-Area, et al.

It is widely known that very small datasets produce overfitting in Deep Neural Networks (DNNs), i.e., the network becomes highly biased toward the data it has been trained on. This issue is often alleviated using transfer learning, regularization techniques and/or data augmentation. This work presents a new approach, independent of but complementary to the previously mentioned techniques, for improving the generalization of DNNs on very small datasets in which the involved classes share many visual features. The proposed methodology, called FuCiTNet (Fusion Class inherent Transformations Network), is inspired by GANs and creates as many generators as there are classes in the problem. Each generator, k, learns the transformations that bring the input image into the k-class domain. We introduce a classification loss in the generators to drive the learning of specific k-class transformations. Our experiments demonstrate that the proposed transformations improve the generalization of the classification model on three diverse datasets.
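The core idea above, one generator per class whose training objective combines a content term with a classification loss that pulls the transformed image toward that class, can be sketched as follows. This is a minimal illustrative toy, not the paper's actual architecture: the linear "generators" and classifier, and the names `K`, `D`, `W`, `C`, and `lam`, are all assumptions for the sake of a runnable example.

```python
import numpy as np

rng = np.random.default_rng(0)
K, D = 3, 8                      # assumed: number of classes, flattened image size
# One "generator" per class, initialized near the identity transform
W = [np.eye(D) + 0.01 * rng.standard_normal((D, D)) for _ in range(K)]
C = rng.standard_normal((K, D))  # toy linear classifier standing in for a CNN

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def generator_loss(x, k, lam=0.5):
    """Loss for generator k: stay close to the input (content term)
    plus push the transformed image toward class k (classification term)."""
    x_k = W[k] @ x                     # transformed image G_k(x)
    content = np.mean((x_k - x) ** 2)  # pixel-wise MSE against the input
    p = softmax(C @ x_k)               # classifier applied to the transformed image
    cls = -np.log(p[k] + 1e-12)        # cross-entropy pulling G_k(x) toward class k
    return content + lam * cls

x = rng.standard_normal(D)             # a random "image"
losses = [generator_loss(x, k) for k in range(K)]
print(len(losses), all(l > 0 for l in losses))
```

In a full implementation each `W[k]` would be a convolutional generator and `C` a trained classifier, with both terms backpropagated so that each generator specializes in its own class-inherent transformation.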


