Understanding the Benefits of Image Augmentations

06/09/2023
by Matthew Iceland, et al.

Image augmentations are widely used to reduce overfitting in neural networks, yet why they help remains largely unexplained. We study which layers of residual neural networks (ResNets) are most affected by augmentations using Centered Kernel Alignment (CKA). We analyze models of varying widths and depths, with weights initialized either randomly or through transfer learning. We find that the pattern of which layers are affected depends on the model's depth, and that augmentations that combine information from two images alter the learned weights significantly more than augmentations that operate on a single image. In ResNets initialized with ImageNet-1K weights and fine-tuned, deeper layers are affected by the augmentations more than early layers. Understanding the effects of image augmentations on CNNs has a variety of applications, such as determining how far back a network needs to be fine-tuned and which layers should be frozen when implementing layer-freezing algorithms.
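CKA, the measure used here to quantify how much a layer's representation changes, has a simple linear form: the centered cross-correlation between two activation matrices, normalized by their self-similarities. The sketch below (assuming NumPy; the function name and shapes are illustrative, not from the paper) shows how linear CKA between two sets of layer activations can be computed:

```python
import numpy as np

def linear_cka(X, Y):
    """Linear Centered Kernel Alignment between two representation
    matrices X, Y of shape (n_examples, n_features).

    Returns a value in [0, 1]; 1 means the representations are
    identical up to an orthogonal transform and isotropic scaling.
    """
    # Center each feature dimension across examples
    X = X - X.mean(axis=0, keepdims=True)
    Y = Y - Y.mean(axis=0, keepdims=True)
    # HSIC-style cross-similarity, normalized by self-similarities
    cross = np.linalg.norm(Y.T @ X, ord="fro") ** 2
    self_x = np.linalg.norm(X.T @ X, ord="fro")
    self_y = np.linalg.norm(Y.T @ Y, ord="fro")
    return cross / (self_x * self_y)
```

Comparing, for each layer, the activations of a model trained with a given augmentation against those of a baseline model yields a per-layer similarity profile; layers with low CKA are the ones the augmentation affected most.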

