Examining and Mitigating Kernel Saturation in Convolutional Neural Networks using Negative Images

05/10/2021
by   Nidhi Gowdra, et al.

Neural saturation in Deep Neural Networks (DNNs) has been studied extensively, but it remains relatively unexplored in Convolutional Neural Networks (CNNs). Understanding and alleviating the effects of convolutional kernel saturation is critical for enhancing the classification accuracy of CNN models. In this paper, we analyze the effect of convolutional kernel saturation in CNNs and propose a simple data augmentation technique to mitigate saturation and increase classification accuracy: supplementing the training dataset with negative images. We hypothesize that richer semantic feature information can be extracted using negative images, since they have the same structural information as standard images but differ in their data representation. Varied data representations decrease the probability of kernel saturation and thus increase the effectiveness of kernel weight updates. We evaluate our hypothesis on CIFAR-10 and STL-10, which contain similar image classes but differ in image resolution, making them well suited to studying the saturation phenomenon; the MNIST dataset is used to highlight the ineffectiveness of the technique for linearly separable data. The ResNet architecture was chosen because its skip connections help retain the features that contribute most to classification accuracy. Our results show that CNNs are indeed susceptible to convolutional kernel saturation, and that supplementing the training dataset with negative images offers a statistically significant increase in classification accuracy compared against models trained on the original datasets, with gains of up to 6.98% across the STL-10 and CIFAR-10 datasets.
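The augmentation described above amounts to appending the pixel-wise complement of each training image to the training set. Below is a minimal sketch of one way to do this, assuming the negative of an image scaled to [0, 1] is simply 1 - x (equivalently 255 - x for 8-bit images); the dataset paths, batch size, and use of torchvision are illustrative choices, not the authors' implementation.

```python
# Sketch: supplement a CIFAR-10 training set with pixel-wise negative images.
# Assumption: "negative image" = per-pixel complement (1 - x after scaling to [0, 1]).
import torch
from torch.utils.data import ConcatDataset, DataLoader
from torchvision import datasets, transforms

to_tensor = transforms.ToTensor()  # scales 8-bit pixels to [0, 1]
negate = transforms.Compose([
    transforms.ToTensor(),
    transforms.Lambda(lambda x: 1.0 - x),  # pixel-wise complement
])

# Negatives keep the same structural (edge/shape) information but change the
# data representation seen by the convolutional kernels.
original = datasets.CIFAR10(root="./data", train=True, download=True, transform=to_tensor)
negative = datasets.CIFAR10(root="./data", train=True, download=True, transform=negate)

# Supplement (not replace): train on the union of both representations.
augmented_train = ConcatDataset([original, negative])
train_loader = DataLoader(augmented_train, batch_size=128, shuffle=True)
```

A ResNet trained on `train_loader` then sees both representations of every image in each epoch, which is the setting the paper compares against training on the original dataset alone.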

