
Repulsive Deep Ensembles are Bayesian

06/22/2021
by Francesco D'Angelo et al.

Deep ensembles have recently gained popularity in the deep learning community for their conceptual simplicity and efficiency. However, maintaining functional diversity between ensemble members that are trained independently with gradient descent is challenging. This can lead to pathologies when adding more ensemble members, such as saturation of the ensemble performance, which converges to that of a single model. Moreover, this affects not only the quality of the ensemble's predictions but, even more so, its uncertainty estimates, and thus its performance on out-of-distribution data. We hypothesize that this limitation can be overcome by discouraging different ensemble members from collapsing to the same function. To this end, we introduce a kernelized repulsive term in the update rule of the deep ensembles. We show that this simple modification not only enforces and maintains diversity among the members but, even more importantly, transforms maximum a posteriori inference into proper Bayesian inference. Namely, we show that the training dynamics of our proposed repulsive ensembles follow a Wasserstein gradient flow of the KL divergence with the true posterior. We study repulsive terms in weight and function space and empirically compare their performance to standard ensembles and Bayesian baselines on synthetic and real-world prediction tasks.
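As a rough illustration of the idea, a kernelized repulsive update of this kind can be sketched in the style of Stein variational gradient descent: each ensemble member's parameters are treated as a particle, and the usual gradient step is augmented with a kernel-gradient term that pushes particles apart. This is a minimal toy sketch, not the paper's implementation; the RBF bandwidth, step size, and the standard-Gaussian toy posterior below are all illustrative assumptions.

```python
import numpy as np

def repulsive_step(W, grad_log_post, step_size=0.1, bandwidth=1.0):
    """One kernelized repulsive (SVGD-style) update of the ensemble particles.

    W: (n, d) array, each row the parameters of one ensemble member.
    grad_log_post: callable mapping W -> (n, d) log-posterior gradients.
    """
    n = W.shape[0]
    diffs = W[:, None, :] - W[None, :, :]            # (n, n, d): w_i - w_j
    sq_dists = np.sum(diffs ** 2, axis=-1)           # (n, n)
    K = np.exp(-sq_dists / (2.0 * bandwidth ** 2))   # RBF kernel matrix
    # Driving term: kernel-weighted average of posterior gradients.
    drive = K @ grad_log_post(W)                     # (n, d)
    # Repulsive term: sum_j grad_{w_j} k(w_j, w_i); pushes particles apart.
    repulse = np.einsum('ij,ijd->id', K, diffs) / bandwidth ** 2
    return W + step_size * (drive + repulse) / n

# Toy demo: target "posterior" is a standard Gaussian, so grad log p(w) = -w.
rng = np.random.default_rng(0)
W = rng.normal(scale=0.01, size=(10, 2))   # particles start nearly collapsed
for _ in range(200):
    W = repulsive_step(W, lambda W: -W)
```

Without the `repulse` term the update reduces to (kernel-smoothed) gradient ascent on the log-posterior, and all members would collapse toward the single MAP solution at the origin; with it, the particles spread out and retain functional diversity.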

06/20/2021

On Stein Variational Neural Network Ensembles

Ensembles of deep neural networks have achieved great success recently, ...
12/30/2021

SAE: Sequential Anchored Ensembles

Computing the Bayesian posterior of a neural network is a challenging ta...
05/29/2021

Greedy Bayesian Posterior Approximation with Deep Ensembles

Ensembles of independently trained neural networks are a state-of-the-ar...
05/24/2022

Diverse Lottery Tickets Boost Ensemble from a Single Pretrained Model

Ensembling is a popular method used to improve performance as a last res...
01/14/2021

DICE: Diversity in Deep Ensembles via Conditional Redundancy Adversarial Estimation

Deep ensembles perform better than a single network thanks to the divers...
04/30/2021

Forming Ensembles at Runtime: A Machine Learning Approach

Smart system applications (SSAs) built on top of cyber-physical and soci...
11/04/2019

Ensembles of Locally Independent Prediction Models

Many ensemble methods encourage their constituent models to be diverse, ...