
Packed-Ensembles for Efficient Uncertainty Estimation

by Olivier Laurent, et al.

Deep Ensembles (DE) are a prominent approach that achieves excellent performance on key metrics such as accuracy, calibration, uncertainty estimation, and out-of-distribution detection. However, the hardware limitations of real-world systems constrain them to smaller ensembles and lower-capacity networks, significantly deteriorating their performance and properties. We introduce Packed-Ensembles (PE), a strategy for designing and training lightweight structured ensembles by carefully modulating the dimension of their encoding space. We leverage grouped convolutions to parallelize the ensemble into a single common backbone and forward pass, improving training and inference speeds. PE is designed to work within the memory budget of a single standard neural network. Through extensive studies, we show that PE faithfully preserves the properties of DE, e.g., diversity, and matches its performance in terms of accuracy, calibration, out-of-distribution detection, and robustness to distribution shift.
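The core mechanism, packing independent ensemble members into one tensor and running them in a single forward pass via grouped convolutions, can be sketched in plain NumPy. This is an illustrative toy (the function name, the 1x1-convolution simplification, and all shapes here are assumptions for exposition, not the paper's API):

```python
import numpy as np

def grouped_conv1x1(x, weights, groups):
    """Grouped 1x1 convolution: the channel axis is split into `groups`
    independent slices, each transformed by its own weight matrix.
    Each group plays the role of one ensemble member.
    x: array of shape (C_in, H, W); weights: list of (C_out_g, C_in_g)."""
    c_in = x.shape[0]
    c_in_g = c_in // groups
    outs = []
    for g in range(groups):
        xg = x[g * c_in_g:(g + 1) * c_in_g]            # member g's channel slice
        outs.append(np.einsum('oc,chw->ohw', weights[g], xg))
    return np.concatenate(outs, axis=0)               # packed output, all members

# Toy ensemble of M = 2 members packed into a single tensor.
rng = np.random.default_rng(0)
M = 2
x = rng.normal(size=(8, 4, 4))                        # 8 channels = 2 members x 4
w = [rng.normal(size=(4, 4)) for _ in range(M)]
y = grouped_conv1x1(x, w, groups=M)

# Sanity check: member 0 computed on its own slice yields the same result,
# i.e., the groups never mix information across ensemble members.
y0 = np.einsum('oc,chw->ohw', w[0], x[:4])
assert np.allclose(y[:4], y0)
```

In a real implementation this corresponds to a single convolution layer with its `groups` parameter set to the ensemble size, so one kernel launch computes all members at once instead of looping over separate networks.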

