
Repulsive Deep Ensembles are Bayesian

by Francesco D'Angelo, et al.

Deep ensembles have recently gained popularity in the deep learning community for their conceptual simplicity and efficiency. However, maintaining functional diversity between ensemble members that are independently trained with gradient descent is challenging. This can lead to pathologies when adding more ensemble members, such as saturation of the ensemble performance, which converges to the performance of a single model. Moreover, this affects not only the quality of the ensemble's predictions but, even more so, its uncertainty estimates, and thus its performance on out-of-distribution data. We hypothesize that this limitation can be overcome by discouraging different ensemble members from collapsing to the same function. To this end, we introduce a kernelized repulsive term in the update rule of the deep ensembles. We show that this simple modification not only enforces and maintains diversity among the members but, even more importantly, turns maximum a posteriori inference into proper Bayesian inference. Namely, we show that the training dynamics of our proposed repulsive ensembles follow a Wasserstein gradient flow of the KL divergence with the true posterior. We study repulsive terms in weight and function space and empirically compare their performance to standard ensembles and Bayesian baselines on synthetic and real-world prediction tasks.
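The kernelized repulsive update can be sketched on a toy problem. The snippet below treats each ensemble member as a one-dimensional "weight" particle on a standard Gaussian posterior and adds a normalized RBF-kernel repulsion to the log-posterior gradient, in the spirit of the weight-space variant described above. The kernel choice, bandwidth, step size, and all names here are illustrative assumptions, not the paper's exact configuration.

```python
import numpy as np

def grad_log_post(w):
    # Toy log-posterior: log N(w; 0, 1), so the gradient is simply -w.
    return -w

def rbf(a, b, h=0.5):
    # RBF kernel with (assumed) bandwidth h.
    return np.exp(-(a - b) ** 2 / (2 * h ** 2))

def grad_rbf_wrt_a(a, b, h=0.5):
    # Derivative of the RBF kernel with respect to its first argument.
    return -(a - b) / h ** 2 * rbf(a, b, h)

def repulsive_step(ws, lr=0.05):
    # One repulsive-ensemble update: gradient ascent on the log posterior,
    # minus a kernel-based repulsion normalized by the summed kernel values,
    # which discourages members from collapsing to the same point.
    new = np.empty_like(ws)
    for i, wi in enumerate(ws):
        k_vals = np.array([rbf(wi, wj) for wj in ws])
        k_grads = np.array([grad_rbf_wrt_a(wi, wj) for wj in ws])
        new[i] = wi + lr * (grad_log_post(wi) - k_grads.sum() / k_vals.sum())
    return new

rng = np.random.default_rng(0)
ws = rng.normal(3.0, 0.1, size=10)  # members initialized close together
for _ in range(500):
    ws = repulsive_step(ws)
# Members drift toward the posterior mode but stay spread out,
# instead of all collapsing onto the single MAP solution.
```

Without the repulsion term, every member of this toy ensemble would converge to the same mode at 0; with it, the stationary particle configuration covers the posterior mass, which is the mechanism behind the Wasserstein-gradient-flow interpretation in the abstract.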
