What augmentations are sensitive to hyper-parameters and why?

11/06/2021
by Ch Muhammad Awais, et al.

We apply augmentations to our datasets to improve prediction quality and make final models more resilient to noisy data and domain drift. Yet the question remains: how do these augmentations perform under different hyper-parameters? In this study we evaluate the sensitivity of augmentations to a model's hyper-parameters, along with their consistency and influence, by performing a Local Surrogate (LIME) interpretation of the impact of hyper-parameters when different augmentations are applied to a machine learning model. We use linear regression coefficients to weigh each augmentation. Our results show that some augmentations are highly sensitive to hyper-parameters while others are more resilient and reliable.
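The core idea, a local linear surrogate whose coefficients serve as sensitivity weights, can be sketched as follows. This is a hypothetical illustration, not the authors' code: the hyper-parameter names, the synthetic "accuracy" response, and the use of plain least squares (LIME proper adds a locality-weighting kernel) are all assumptions made for brevity.

```python
# Hypothetical sketch: fit a linear surrogate mapping perturbed
# hyper-parameter settings to model accuracy, then read each
# coefficient's magnitude as that hyper-parameter's sensitivity
# under a given augmentation. Data and names are invented.
import numpy as np

rng = np.random.default_rng(0)

# Perturbed configurations: (learning_rate, batch_size), both scaled to [0, 1].
X = rng.uniform(0.0, 1.0, size=(100, 2))

# Synthetic "accuracy" response: strongly driven by the first
# hyper-parameter, weakly by the second (stand-in for real evaluations
# of the augmented model).
y = 0.8 * X[:, 0] + 0.1 * X[:, 1] + rng.normal(0.0, 0.01, size=100)

# Linear surrogate via ordinary least squares (LIME uses a
# locality-kernel-weighted variant; omitted here for brevity).
A = np.column_stack([X, np.ones(len(X))])  # append intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Coefficient magnitude = sensitivity weight for each hyper-parameter.
sensitivity = np.abs(coef[:2])
print(sensitivity)
```

Under this reading, an augmentation whose surrogate coefficients are large and unstable across runs would be flagged as hyper-parameter-sensitive, while small, stable coefficients indicate resilience.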

