How Tempering Fixes Data Augmentation in Bayesian Neural Networks

05/27/2022
by Gregor Bachmann, et al.

While Bayesian neural networks (BNNs) provide a sound and principled alternative to standard neural networks, an artificial sharpening of the posterior usually needs to be applied to reach comparable performance. This stands in stark contrast to theory, which dictates that given an adequate prior and a well-specified model, the untempered Bayesian posterior should achieve optimal performance. Despite the community's extensive efforts, the origin of the observed performance gains remains disputed, with several plausible causes proposed. While data augmentation has been empirically identified as one of the main drivers of this effect, a theoretical account of its role is still largely missing. In this work we identify two interlaced factors concurrently influencing the strength of the cold posterior effect, namely the correlated nature of augmentations and the degree of invariance of the employed model to such transformations. By theoretically analyzing simplified settings, we prove that tempering implicitly reduces the misspecification arising from modeling augmentations as i.i.d. data. The temperature mimics the role of the effective sample size, reflecting the gain in information provided by the augmentations. We corroborate our theoretical findings with extensive empirical evaluations, scaling to realistic BNNs. By relying on the framework of group convolutions, we experiment with models of varying inherent degrees of invariance, confirming the hypothesized relationship between invariance and the optimal temperature.
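The claim that the temperature plays the role of an effective sample size can be made concrete in a toy conjugate setting. The following is a minimal sketch, not code from the paper, and the Gaussian mean-estimation model is our own illustrative assumption: if each observation is duplicated K times by perfectly correlated "augmentations" and the copies are treated as i.i.d., the untempered posterior becomes overconfident, while tempering the likelihood with T = K recovers the posterior obtained from the original data.

```python
import numpy as np

# Illustrative sketch (assumed setup, not the paper's code): conjugate Gaussian
# mean estimation. n true observations, each duplicated K times by augmentations
# that add no new information. Treating the n*K points as i.i.d. overcounts the
# likelihood; tempering with T = K restores the effective sample size.

rng = np.random.default_rng(0)
n, K, sigma2, prior_var = 20, 8, 1.0, 10.0
x = rng.normal(2.0, np.sqrt(sigma2), size=n)   # original data
x_aug = np.repeat(x, K)                        # perfectly correlated "augmentations"

def tempered_posterior(data, T):
    # N(0, prior_var) prior on the mean mu, likelihood tempered as p(data | mu)^(1/T):
    # posterior precision = 1/prior_var + n/(T*sigma2), mean = var * sum(data)/(T*sigma2).
    var = 1.0 / (1.0 / prior_var + len(data) / (T * sigma2))
    mean = var * data.sum() / (T * sigma2)
    return mean, var

print(tempered_posterior(x, T=1.0))      # untempered posterior on the original data
print(tempered_posterior(x_aug, T=1.0))  # i.i.d. treatment of augmentations: overconfident
print(tempered_posterior(x_aug, T=K))    # tempering with T = K recovers the original posterior
```

In this caricature the augmentations carry no information at all, so the correct temperature is exactly K; with real, partially correlated augmentations the informative temperature lies somewhere between 1 and K, which is the regime the paper analyzes.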

Related research

06/11/2021
Disentangling the Roles of Curation, Data-Augmentation and the Prior in the Cold Posterior Effect
The "cold posterior effect" (CPE) in Bayesian deep learning describes th...

06/10/2021
Data augmentation in Bayesian neural networks and the cold posterior effect
Data augmentation is a highly effective approach for improving performan...

06/11/2019
Learning robust visual representations using data augmentation invariance
Deep convolutional neural networks trained for image object categorizati...

03/30/2022
On Uncertainty, Tempering, and Data Augmentation in Bayesian Classification
Aleatoric uncertainty captures the inherent randomness of the data, such...

08/13/2020
A statistical theory of cold posteriors in deep neural networks
To get Bayesian neural networks to perform comparably to standard neural...

06/09/2021
Grounding inductive biases in natural images: invariance stems from variations in data
To perform well on unseen and potentially out-of-distribution samples, i...
