SADT: Combining Sharpness-Aware Minimization with Self-Distillation for Improved Model Generalization

11/01/2022
by Masud An Nur Islam Fahim, et al.

Methods for improving deep neural network training time and model generalizability span data augmentation, regularization, and optimization, approaches that tend to be sensitive to hyperparameter settings and thus complicate reproducibility. This work jointly considers two recent training strategies that address model generalizability, sharpness-aware minimization and self-distillation, and proposes the novel training strategy of Sharpness-Aware Distilled Teachers (SADT). The experimental section of this work shows that SADT consistently outperforms previously published training strategies in model convergence time, test-time performance, and model generalizability across various neural architectures, datasets, and hyperparameter settings.
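Since the paper's code is not reproduced here, the following is a minimal PyTorch sketch of how sharpness-aware minimization's two-step perturb-then-descend update (Foret et al.) might be combined with a self-distillation loss. The function name sadt_style_step, the teacher model, and the hyperparameters rho, alpha, and temperature are illustrative assumptions, not details taken from the paper.

import torch
import torch.nn.functional as F

def sadt_style_step(model, teacher, images, labels, optimizer,
                    rho=0.05, alpha=0.5, temperature=4.0):
    """One training step: a SAM-style update on a loss that mixes
    hard-label cross-entropy with a soft distillation term.
    (Hypothetical sketch; hyperparameters are placeholders.)"""
    def loss_fn():
        logits = model(images)
        ce = F.cross_entropy(logits, labels)
        with torch.no_grad():
            teacher_logits = teacher(images)
        # Temperature-scaled KL divergence against the teacher's soft targets.
        kd = F.kl_div(
            F.log_softmax(logits / temperature, dim=-1),
            F.softmax(teacher_logits / temperature, dim=-1),
            reduction="batchmean",
        ) * temperature ** 2
        return (1 - alpha) * ce + alpha * kd

    # Step 1 (ascent): perturb the weights toward the worst-case nearby
    # point, epsilon = rho * g / ||g||, where g is the current gradient.
    loss = loss_fn()
    loss.backward()
    grad_norm = torch.sqrt(sum((p.grad ** 2).sum()
                               for p in model.parameters()
                               if p.grad is not None))
    eps = []
    with torch.no_grad():
        for p in model.parameters():
            if p.grad is None:
                eps.append(None)
                continue
            e = rho * p.grad / (grad_norm + 1e-12)
            p.add_(e)
            eps.append(e)
    optimizer.zero_grad()

    # Step 2 (descent): take the gradient at the perturbed weights,
    # undo the perturbation, then apply the actual optimizer step.
    loss_fn().backward()
    with torch.no_grad():
        for p, e in zip(model.parameters(), eps):
            if e is not None:
                p.sub_(e)
    optimizer.step()
    optimizer.zero_grad()
    return loss.item()

In use, teacher could be any frozen or slowly updated copy of the student (e.g., an exponential moving average), which is one common way self-distillation is instantiated; the sketch leaves that choice open.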

Related research

10/14/2019 · Rethinking Data Augmentation: Self-Supervision and Self-Distillation
Data augmentation techniques, e.g., flipping or cropping, which systemat...

10/22/2020 · Learning Loss for Test-Time Augmentation
Data augmentation has been actively studied for robust neural networks. ...

04/27/2023 · Self-discipline on multiple channels
Self-distillation relies on its own information to improve the generaliz...

03/17/2023 · TeSLA: Test-Time Self-Learning With Automatic Adversarial Augmentation
Most recent test-time adaptation methods focus on only classification ta...

09/03/2022 · Training Strategies for Improved Lip-reading
Several training strategies and temporal models have been recently propo...

10/07/2021 · Efficient Sharpness-aware Minimization for Improved Training of Neural Networks
Overparametrized Deep Neural Networks (DNNs) often achieve astounding pe...

11/10/2022 · How Does Sharpness-Aware Minimization Minimize Sharpness?
Sharpness-Aware Minimization (SAM) is a highly effective regularization ...
