SAM as an Optimal Relaxation of Bayes

10/04/2022
by Thomas Möllenhoff et al.

Sharpness-aware minimization (SAM) and related adversarial deep-learning methods can drastically improve generalization, but their underlying mechanisms are not yet fully understood. Here, we establish SAM as a relaxation of the Bayes objective in which the expected negative loss is replaced by its optimal convex lower bound, obtained via the so-called Fenchel biconjugate. This connection enables a new Adam-like extension of SAM that automatically yields reasonable uncertainty estimates, while sometimes also improving accuracy. By bridging adversarial and Bayesian methods, our work opens a new path to robustness.
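For context, the SAM update that the paper relates to the Bayes objective is a two-step gradient procedure: ascend to an adversarial point within a small ball around the current weights, then descend using the gradient evaluated there. The sketch below is a minimal illustration of that standard update; the toy quadratic loss, learning rate, and radius `rho` are assumptions for demonstration, not values from the paper.

```python
import numpy as np

def sam_step(w, loss_grad, lr=0.1, rho=0.05):
    """One sharpness-aware minimization (SAM) step.

    First move to the (approximate) worst-case point within an L2 ball
    of radius rho around w, then take a gradient descent step using the
    gradient computed at that perturbed point.
    """
    g = loss_grad(w)
    # Adversarial perturbation: step of length rho in the gradient direction.
    eps = rho * g / (np.linalg.norm(g) + 1e-12)
    g_adv = loss_grad(w + eps)  # gradient at the perturbed weights
    return w - lr * g_adv

# Toy example: quadratic loss L(w) = 0.5 * ||w||^2, whose gradient is w.
w = np.array([1.0, -2.0])
for _ in range(100):
    w = sam_step(w, lambda w: w)
# After many steps, w oscillates in a small neighborhood of the minimum.
```

In practice the two gradient evaluations per step double the cost relative to plain SGD, which is the price paid for the sharpness-aware perturbation.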


