Model Generalization: A Sharpness Aware Optimization Perspective

08/14/2022
by Jozef Marus Coldenhoff et al.

Sharpness-Aware Minimization (SAM) and Adaptive Sharpness-Aware Minimization (ASAM) aim to improve model generalization. In this project, we propose three experiments to validate their generalization behavior from a sharpness-aware perspective. Our experiments show that sharpness-aware optimization techniques can help produce models with strong generalization ability. They also suggest that ASAM can improve generalization performance on unnormalized data, though further research is needed to confirm this.
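Both methods share the same two-step structure: SAM approximately solves min_w max_{||e|| <= rho} L(w + e) by first ascending to a nearby high-loss point w + e, then updating w with the gradient measured there; ASAM additionally rescales the perturbation per parameter by |w| so it is invariant to weight rescaling. The sketch below is a minimal PyTorch illustration of that update under our own assumptions, not the paper's code; the function name `sam_step`, the default `rho=0.05`, and the stabilizing `1e-12` constant are ours.

```python
import torch


def sam_step(model, loss_fn, data, target, base_opt, rho=0.05, adaptive=False):
    # One SAM/ASAM update: ascend to a nearby high-loss point within an
    # L2 ball of radius rho, then descend with the gradient found there.
    # With adaptive=True the ball is rescaled per parameter by |w| (ASAM).

    # First forward/backward pass: gradient g at the current weights w.
    loss_fn(model(data), target).backward()

    with torch.no_grad():
        # Norm of the (optionally rescaled) gradient, ||T_w g|| with
        # T_w = diag(|w|) for ASAM and T_w = I for plain SAM.
        norms = [((p.abs() * p.grad) if adaptive else p.grad).norm()
                 for p in model.parameters() if p.grad is not None]
        grad_norm = torch.stack(norms).norm()

        # Perturbation e = rho * T_w^2 g / ||T_w g||; remember it so the
        # original weights can be restored after the second pass.
        eps = {}
        for p in model.parameters():
            if p.grad is None:
                continue
            e = ((p.abs() ** 2) * p.grad if adaptive else p.grad) \
                * (rho / (grad_norm + 1e-12))
            p.add_(e)  # climb to w + e
            eps[p] = e
    model.zero_grad()

    # Second pass: gradient of the loss at the perturbed weights w + e.
    loss_fn(model(data), target).backward()

    with torch.no_grad():
        for p, e in eps.items():
            p.sub_(e)  # restore the original weights

    base_opt.step()       # apply the sharpness-aware gradient via SGD/Adam
    base_opt.zero_grad()
```

ASAM's element-wise |w| scaling is what makes the neighborhood scale-invariant, which is one plausible reason its behavior can differ from SAM's on unnormalized data.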


Related Research

10/16/2021
Sharpness-Aware Minimization Improves Language Model Generalization
The allure of superhuman-level capabilities has led to considerable inte...

07/20/2023
Flatness-Aware Minimization for Domain Generalization
Domain generalization (DG) seeks to learn robust models that generalize ...

10/22/2020
Label-Aware Neural Tangent Kernel: Toward Better Generalization and Local Elasticity
As a popular approach to modeling the dynamics of training overparametri...

02/23/2021
ASAM: Adaptive Sharpness-Aware Minimization for Scale-Invariant Learning of Deep Neural Networks
Recently, learning algorithms motivated from sharpness of loss surface a...

09/14/2023
Gradient constrained sharpness-aware prompt learning for vision-language models
This paper targets a novel trade-off problem in generalizable prompt lea...

11/10/2022
How Does Sharpness-Aware Minimization Minimize Sharpness?
Sharpness-Aware Minimization (SAM) is a highly effective regularization ...

10/24/2022
Sharpness-aware Minimization for Worst Case Optimization
Improvement of worst group performance and generalization performance ar...
