Repulsive Deep Ensembles are Bayesian

06/22/2021
by Francesco D'Angelo, et al.

Deep ensembles have recently gained popularity in the deep learning community for their conceptual simplicity and efficiency. However, maintaining functional diversity between ensemble members that are independently trained with gradient descent is challenging. This can lead to pathologies when adding more ensemble members, such as a saturation of ensemble performance, which converges to that of a single model. Moreover, this affects not only the quality of the ensemble's predictions, but even more so its uncertainty estimates, and thus its performance on out-of-distribution data. We hypothesize that this limitation can be overcome by discouraging different ensemble members from collapsing to the same function. To this end, we introduce a kernelized repulsive term in the update rule of deep ensembles. We show that this simple modification not only enforces and maintains diversity among the members but, even more importantly, turns maximum a posteriori inference into proper Bayesian inference. Namely, we show that the training dynamics of our proposed repulsive ensembles follow a Wasserstein gradient flow of the KL divergence to the true posterior. We study repulsive terms in weight and function space and empirically compare their performance to standard ensembles and Bayesian baselines on synthetic and real-world prediction tasks.
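The abstract does not spell out the update rule, but a natural reading is a particle update that ascends the log posterior while a kernel-based term pushes members apart; the particle form of the Wasserstein gradient flow of KL(rho || posterior) is dw/dt = grad log pi(w) - grad log rho(w). Below is a minimal NumPy sketch of one such update, assuming an RBF kernel, the median bandwidth heuristic, and a kernel-density estimate of grad log rho as the repulsive force. The function names, the bandwidth choice, and the toy Gaussian target are illustrative assumptions, not details taken from the paper.

import numpy as np

def pairwise(X):
    # Pairwise differences and squared Euclidean distances between members.
    diffs = X[:, None, :] - X[None, :, :]        # (n, n, d), diffs[i, j] = x_i - x_j
    return diffs, np.sum(diffs ** 2, axis=-1)    # (n, n, d), (n, n)

def repulsive_step(W, grad_log_post, lr=0.05):
    # One kernelized repulsive update on an ensemble of weight vectors W (n, d):
    # drive each member up the log posterior, and subtract a kernel-density
    # estimate of grad log rho(w_i) so members repel instead of collapsing.
    n = W.shape[0]
    diffs, sq = pairwise(W)
    h = np.median(sq) / np.log(n + 1) + 1e-12    # median bandwidth heuristic (assumed)
    K = np.exp(-sq / h)                          # RBF kernel matrix, (n, n)
    grad_K = (-2.0 / h) * diffs * K[..., None]   # grad_{w_i} k(w_i, w_j), (n, n, d)
    repulsion = grad_K.sum(axis=1) / K.sum(axis=1, keepdims=True)
    return W + lr * (grad_log_post(W) - repulsion)

# Toy check: with grad log p(w) = -w (standard Gaussian posterior), the
# ensemble should spread out to roughly unit covariance rather than
# collapsing onto the MAP estimate at the origin.
rng = np.random.default_rng(0)
W = 3.0 * rng.normal(size=(20, 2))
for _ in range(1000):
    W = repulsive_step(W, lambda X: -X)
print(np.cov(W.T))

Dropping the repulsion term recovers the standard deep-ensemble update, under which all members would descend toward the same mode.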

