Self-Competitive Neural Networks

08/22/2020
by Iman Saberi, et al.

Deep Neural Networks (DNNs) have improved the accuracy of classification in many applications. One of the challenges in training a DNN is its need for a rich dataset to increase its accuracy and avoid overfitting. One way to improve the generalization of DNNs is to augment the training data with newly synthesized adversarial samples, and researchers have recently proposed many data augmentation methods of this kind. In this paper, we generate adversarial samples to refine the Domains of Attraction (DoAs) of each class. In this approach, at each stage we use the model learned from the primary data and the adversarial data generated so far to manipulate the primary data in a way that appears difficult to the DNN. The DNN is then retrained on the augmented data, and it again generates adversarial data that it finds hard to predict. Because the DNN tries to improve its accuracy by competing with itself (generating hard samples and then learning them), the technique is called a Self-Competitive Neural Network (SCNN). To generate such samples, we pose the problem as an optimization task in which the network weights are fixed and a gradient-descent-based method synthesizes adversarial samples lying on the boundary between their true labels and the nearest wrong labels. Our experimental results show that data augmentation using SCNNs can significantly increase the accuracy of the original network. As an example, the accuracy of a CNN trained on only 1000 MNIST training samples improves from 94.26
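The sample-synthesis step described above can be illustrated with a short sketch. The abstract does not include code, so the following PyTorch snippet is only an assumption about one way to realize the idea: freeze the trained network's weights and run gradient descent on the input so that the margin between the true class and the nearest wrong class shrinks, producing hard samples near the decision boundary. The function name synthesize_boundary_samples and the parameters steps and lr are illustrative, not taken from the paper.

    import torch

    def synthesize_boundary_samples(model, x, y, steps=50, lr=0.01):
        """Push labeled samples (x, y) toward the decision boundary of a trained model."""
        model.eval()
        for p in model.parameters():
            p.requires_grad_(False)          # network weights are fixed during synthesis

        x_adv = x.clone().detach().requires_grad_(True)
        opt = torch.optim.SGD([x_adv], lr=lr)

        for _ in range(steps):
            logits = model(x_adv)

            # nearest wrong label: highest-scoring class other than the true one
            masked = logits.clone()
            masked.scatter_(1, y.unsqueeze(1), float("-inf"))
            y_wrong = masked.argmax(dim=1)

            # shrink the margin between the true-class and nearest-wrong-class logits,
            # driving the sample toward the boundary between the two classes
            margin = logits.gather(1, y.unsqueeze(1)) - logits.gather(1, y_wrong.unsqueeze(1))
            loss = margin.clamp(min=0).mean()

            opt.zero_grad()
            loss.backward()
            opt.step()

        return x_adv.detach()   # hard samples, still paired with their true labels y

The synthesized samples would then be added to the training set and the network retrained, after which the procedure repeats with the updated model.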
