Prune and Tune Ensembles: Low-Cost Ensemble Learning With Sparse Independent Subnetworks

02/23/2022 · by Tim Whitaker, et al.

Ensemble learning is an effective method for improving generalization in machine learning. However, as state-of-the-art neural networks grow larger, the computational cost of training several independent networks becomes prohibitive. We introduce a fast, low-cost method for creating diverse ensembles of neural networks without training multiple models from scratch. We first train a single parent network. We then create child networks by cloning the parent and dramatically pruning each clone's parameters, producing an ensemble of members with unique and diverse topologies. Each child is then briefly fine-tuned for a small number of epochs; because the children start from trained parameters, they converge significantly faster than networks trained from scratch. We explore several ways to maximize diversity among the child networks, including anti-random pruning and one-cycle tuning. This diversity enables "Prune and Tune" ensembles to achieve results competitive with traditional ensembles at a fraction of the training cost. We benchmark our approach against state-of-the-art low-cost ensemble methods and show marked improvements in both accuracy and uncertainty estimation on CIFAR-10 and CIFAR-100.
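The abstract outlines a concrete pipeline: train a parent, clone it, prune each clone, then briefly tune each child with a one-cycle schedule. Below is a minimal PyTorch sketch of that pipeline. The function names, the 50% sparsity, and the SGD/OneCycleLR hyperparameters are illustrative assumptions for this sketch, not the authors' released implementation.

import copy
import torch
import torch.nn as nn

def antirandom_masks(parent: nn.Module, sparsity: float = 0.5):
    """Draw a random binary mask and its complement for each weight
    matrix, so a pair of children keeps disjoint parameter sets
    (the anti-random pairing mentioned in the abstract)."""
    masks, anti_masks = {}, {}
    for name, param in parent.named_parameters():
        if param.dim() < 2:  # leave biases and norm parameters dense
            continue
        keep = (torch.rand_like(param) > sparsity).float()
        masks[name] = keep
        anti_masks[name] = 1.0 - keep  # complementary topology
    return masks, anti_masks

def make_child(parent: nn.Module, mask: dict) -> nn.Module:
    """Clone the parent and zero out the pruned weights."""
    child = copy.deepcopy(parent)
    with torch.no_grad():
        for name, param in child.named_parameters():
            if name in mask:
                param.mul_(mask[name])
    return child

def tune_child(child, mask, loader, epochs=2, max_lr=0.1, device="cpu"):
    """Briefly fine-tune one child with a one-cycle learning-rate
    schedule, re-applying its mask after every step so the pruned
    weights stay at zero."""
    child.to(device).train()
    opt = torch.optim.SGD(child.parameters(), lr=max_lr, momentum=0.9)
    sched = torch.optim.lr_scheduler.OneCycleLR(
        opt, max_lr=max_lr, total_steps=epochs * len(loader))
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            opt.zero_grad()
            loss_fn(child(x), y).backward()
            opt.step()
            sched.step()
            with torch.no_grad():  # keep the sparse topology fixed
                for name, param in child.named_parameters():
                    if name in mask:
                        param.mul_(mask[name])
    return child

Given a fully trained parent, antirandom_masks(parent) yields a mask pair whose surviving weights are disjoint, which is where the topological diversity between sibling children comes from; at test time the ensemble prediction would average the children's softmax outputs.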

Related research

09/12/2018
Rapid Training of Very Large Ensembles of Diverse Neural Networks
Ensembles of deep neural networks with diverse architectures significant...

02/12/2023
Interpretable Diversity Analysis: Visualizing Feature Representations In Low-Cost Ensembles
Diversity is an important consideration in the construction of robust ne...

03/03/2020
Distilled Hierarchical Neural Ensembles with Adaptive Inference Cost
Deep neural networks form the basis of state-of-the-art models across a ...

02/17/2020
BatchEnsemble: an Alternative Approach to Efficient Ensemble and Lifelong Learning
Ensembles, where multiple neural networks are trained individually and t...

03/01/2018
Learning Sparse Structured Ensembles with SG-MCMC and Network Pruning
An ensemble of neural networks is known to be more robust and accurate t...

06/27/2022
Effective training-time stacking for ensembling of deep neural networks
Ensembling is a popular and effective method for improving machine learn...

10/03/2021
Boost Neural Networks by Checkpoints
Training multiple deep neural networks (DNNs) and averaging their output...
