Sequential Bayesian Neural Subnetwork Ensembles

06/01/2022
by Sanket Jantre, et al.

Deep neural network ensembles that promote model diversity have been used successfully to improve predictive performance and robustness in several applications. Meanwhile, it has recently been shown that sparse subnetworks of dense models can match the performance of their dense counterparts and increase their robustness while substantially reducing model complexity. However, most ensembling techniques require multiple parallel, costly evaluations and have been proposed primarily for deterministic models, while sparsity has mostly been induced through ad-hoc pruning. We propose sequential ensembling of dynamic Bayesian neural subnetworks, which systematically reduces model complexity through sparsity-inducing priors and generates diverse ensembles in a single forward pass of the model. The ensembling strategy consists of an exploration phase that locates high-performing regions of the parameter space, followed by multiple exploitation phases that exploit the compactness of the sparse model to converge quickly to different minima in the energy landscape, each corresponding to a high-performing subnetwork; together, these yield a diverse ensemble. We empirically demonstrate that our approach surpasses dense frequentist and Bayesian ensemble baselines in prediction accuracy, uncertainty estimation, and out-of-distribution (OoD) robustness on CIFAR10 and CIFAR100, as well as on their corruption-induced OoD variants, CIFAR10-C and CIFAR100-C. Furthermore, our approach produces the most diverse ensembles among methods requiring a single forward pass, and in some cases even among methods requiring multiple forward passes.
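To make the exploration/exploitation schedule concrete, below is a minimal PyTorch sketch of sequential ensembling in this spirit: a high-learning-rate exploration phase, then repeated cosine-decay exploitation phases, with one model snapshot harvested per phase. The function and parameter names (sequential_subnetwork_ensemble, explore_epochs, n_snapshots, etc.) are illustrative assumptions, and a simple L1 penalty stands in for the paper's sparsity-inducing priors; this is not the authors' implementation.

import math
import torch
import torch.nn.functional as F

def sequential_subnetwork_ensemble(model, train_loader, device="cpu",
                                   explore_epochs=5, n_snapshots=3,
                                   exploit_epochs=4, lr_high=0.1, lr_low=1e-3,
                                   sparsity_weight=1e-4):
    """Train once; harvest one subnetwork snapshot per exploitation phase."""
    model.to(device)
    opt = torch.optim.SGD(model.parameters(), lr=lr_high, momentum=0.9)
    snapshots = []

    def run_epoch(lr):
        for group in opt.param_groups:
            group["lr"] = lr
        for x, y in train_loader:
            x, y = x.to(device), y.to(device)
            opt.zero_grad()
            loss = F.cross_entropy(model(x), y)
            # Stand-in for a sparsity-inducing prior: an L1 penalty that
            # drives many weights toward zero, exposing a subnetwork.
            loss = loss + sparsity_weight * sum(p.abs().sum() for p in model.parameters())
            loss.backward()
            opt.step()

    # Exploration phase: large constant learning rate to find a
    # high-performing region of the parameter space.
    for _ in range(explore_epochs):
        run_epoch(lr_high)

    # Exploitation phases: cosine-decay the learning rate to settle into a
    # minimum, snapshot the model, then restart the rate to escape toward a
    # different minimum, yielding diverse ensemble members from one run.
    for _ in range(n_snapshots):
        for epoch in range(exploit_epochs):
            lr = lr_low + 0.5 * (lr_high - lr_low) * (1 + math.cos(math.pi * epoch / exploit_epochs))
            run_epoch(lr)
        snapshots.append({k: v.detach().clone() for k, v in model.state_dict().items()})
    return snapshots

def ensemble_predict(model, snapshots, x):
    """Average softmax predictions over the harvested snapshots."""
    probs = []
    with torch.no_grad():
        for state in snapshots:
            model.load_state_dict(state)
            probs.append(F.softmax(model(x), dim=-1))
    return torch.stack(probs).mean(0)

Averaging the softmax outputs of the snapshots, as in ensemble_predict above, is one standard way to obtain the ensemble's predictive distribution for accuracy and uncertainty evaluation.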


