Training independent subnetworks for robust prediction

by Marton Havasi et al.

Recent approaches to efficiently ensemble neural networks have shown that strong robustness and uncertainty performance can be achieved with a negligible gain in parameters over the original network. However, these methods still require multiple forward passes for prediction, leading to a significant computational cost. In this work, we show a surprising result: the benefits of using multiple predictions can be achieved "for free" under a single model's forward pass. In particular, we show that, using a multi-input multi-output (MIMO) configuration, one can utilize a single model's capacity to train multiple subnetworks that independently learn the task at hand. By ensembling the predictions made by the subnetworks, we improve model robustness without increasing compute. We observe a significant improvement in negative log-likelihood, accuracy, and calibration error on CIFAR10, CIFAR100, ImageNet, and their out-of-distribution variants compared to previous methods.
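The MIMO idea described above can be illustrated with a toy sketch: a single network takes M inputs concatenated together and produces M classification heads; at inference time the same input is tiled M times, so one forward pass yields M subnetwork predictions that are averaged into an ensemble. This is a minimal NumPy illustration of the input/output shapes, not the paper's actual architecture or training procedure; the layer sizes and the two-layer MLP are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

M = 3                                 # number of subnetworks
d_in, d_hid, n_classes = 8, 32, 10    # hypothetical toy dimensions

# One shared backbone: the input layer consumes M concatenated examples,
# and the output layer exposes M classification heads.
W1 = rng.normal(size=(M * d_in, d_hid)) * 0.1
W2 = rng.normal(size=(d_hid, M * n_classes)) * 0.1

def forward(x_cat):
    """x_cat: (batch, M*d_in) -> (batch, M, n_classes) logits."""
    h = np.maximum(x_cat @ W1, 0.0)           # shared hidden layer (ReLU)
    return (h @ W2).reshape(-1, M, n_classes)  # one logit vector per head

# Training (not shown) would feed each head an independent input/label pair.
# Inference: tile the SAME input M times -> one forward pass, M predictions.
x = rng.normal(size=(1, d_in))
logits = forward(np.tile(x, (1, M)))           # (1, M, n_classes)
probs = np.exp(logits) / np.exp(logits).sum(-1, keepdims=True)
ensemble = probs.mean(axis=1)                  # average the M subnetwork predictions
print(ensemble.shape)                          # (1, 10)
```

Because the heads share the backbone's capacity but are trained on independent examples, their predictions decorrelate, which is what makes averaging them behave like an ensemble at the cost of a single forward pass.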


Related research:

- Robust Object Detection with Multi-input Multi-output Faster R-CNN
- Depth Uncertainty in Neural Networks
- Learning Neural Network Subspaces
- Sequential Bayesian Neural Subnetwork Ensembles
- Towards reliable and fair probabilistic predictions: field-aware calibration with neural networks
- PEP: Parameter Ensembling by Perturbation
- Lottery Pools: Winning More by Interpolating Tickets without Increasing Training or Inference Cost