A Simple Dynamic Learning Rate Tuning Algorithm For Automated Training of DNNs

10/25/2019
by   Koyel Mukherjee, et al.

Training neural networks on image datasets generally requires extensive experimentation to find an optimal learning rate regime. In particular, for adversarial training or for training a newly synthesized model, the best learning rate regime is not known beforehand. We propose an automated algorithm for determining the learning rate trajectory that works across datasets and models, for both natural and adversarial training, without requiring any dataset- or model-specific tuning. It is a stand-alone, parameterless, adaptive approach with no computational overhead. We discuss the algorithm's convergence behavior theoretically and validate it extensively through experiments. Our results show that the proposed approach consistently achieves top-level accuracy compared to state-of-the-art (SOTA) baselines in the literature, in natural as well as adversarial training.
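The abstract does not spell out the algorithm itself, but the general idea of dynamically tuning a learning rate from training feedback can be illustrated with a minimal sketch. The following is a hypothetical loss-feedback scheme (raise the rate when the loss improves, back off when it worsens), written for a toy one-dimensional objective; it is NOT the paper's method, and the multiplicative factors 1.1 and 0.5 are arbitrary choices for illustration.

```python
# Illustrative sketch of dynamic learning-rate tuning from loss feedback.
# NOT the algorithm proposed in the paper; a generic stand-in for the idea.

def loss(x):
    # Toy convex objective: f(x) = (x - 3)^2, minimized at x = 3.
    return (x - 3.0) ** 2

def grad(x):
    # Analytic gradient of the toy objective.
    return 2.0 * (x - 3.0)

def train_with_dynamic_lr(x, lr=0.5, steps=50):
    """Gradient descent where the learning rate adapts each step:
    if the loss decreased, cautiously increase lr; otherwise cut it."""
    prev = loss(x)
    for _ in range(steps):
        x = x - lr * grad(x)
        cur = loss(x)
        if cur < prev:
            lr *= 1.1   # progress made: probe a slightly larger rate
        else:
            lr *= 0.5   # loss worsened or stalled: back off sharply
        prev = cur
    return x, lr

x_final, lr_final = train_with_dynamic_lr(0.0)
print(x_final)  # converges close to the minimizer x = 3
```

A scheme of this shape needs no hand-set schedule, which is the property the paper claims for its (different, parameterless) algorithm.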


Related research

05/17/2021: An SDE Framework for Adversarial Training, with Convergence and Robustness Analysis
Adversarial training has gained great popularity as one of the most effe...

12/25/2020: A Simple Fine-tuning Is All You Need: Towards Robust Deep Learning Via Adversarial Fine-tuning
Adversarial Training (AT) with Projected Gradient Descent (PGD) is an ef...

03/26/2019: Improving image classifiers for small datasets by learning rate adaptations
Our paper introduces an efficient combination of established techniques ...

03/25/2020: Auto-Ensemble: An Adaptive Learning Rate Scheduling based Deep Learning Model Ensembling
Ensembling deep learning models is a shortcut to promote its implementat...

05/24/2022: Structured Prompt Tuning
We propose structured prompt tuning, a simple and effective method to im...

11/18/2019: Feedback Control for Online Training of Neural Networks
Convolutional neural networks (CNNs) are commonly used for image classif...

05/18/2020: Parsimonious Computing: A Minority Training Regime for Effective Prediction in Large Microarray Expression Data Sets
Rigorous mathematical investigation of learning rates used in back-propa...
