Intra-Ensemble in Neural Networks

04/09/2019
by Yuan Gao, et al.

Improving model performance is a central problem in machine learning, including deep learning. However, stand-alone neural networks see diminishing returns as more layers are stacked. Ensembling is a useful technique for further enhancing performance, but training several independent stand-alone deep neural networks multiplies the required resources. In this work, we propose Intra-Ensemble, an end-to-end strategy that uses stochastic training operations to train several sub-networks simultaneously within one neural network. The additional parameter cost is marginal, since the majority of parameters are mutually shared. Meanwhile, stochastic training increases the diversity of the weight-sharing sub-networks, which significantly enhances intra-ensemble performance. Extensive experiments demonstrate the applicability of intra-ensemble across a variety of datasets and network architectures. Our models achieve results comparable to state-of-the-art architectures on CIFAR-10 and CIFAR-100.
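The core idea can be illustrated with a toy sketch (this is not the authors' code; the masking scheme, layer sizes, and averaging rule below are illustrative assumptions): several sub-networks share one set of weights, each sub-network is distinguished only by a cheap fixed channel mask, and the ensemble prediction averages the sub-network outputs.

```python
import numpy as np

rng = np.random.default_rng(0)

IN_DIM, HIDDEN, OUT_DIM, NUM_SUBNETS = 8, 16, 3, 4

# Mutually shared parameters (the bulk of the model).
W1 = rng.standard_normal((IN_DIM, HIDDEN)) * 0.1
W2 = rng.standard_normal((HIDDEN, OUT_DIM)) * 0.1

# Each sub-network adds only a binary mask over hidden channels, a
# stand-in for the paper's stochastic training operations; the mask
# differences are what give the sub-networks their diversity.
masks = (rng.random((NUM_SUBNETS, HIDDEN)) < 0.7).astype(float)

def subnet_forward(x, mask):
    """Forward pass of one weight-sharing sub-network."""
    h = np.maximum(x @ W1, 0.0) * mask  # keep only this sub-network's channels
    return h @ W2

def intra_ensemble_predict(x):
    """Average the sub-network outputs to form the ensemble prediction."""
    outs = [subnet_forward(x, m) for m in masks]
    return np.mean(outs, axis=0)

x = rng.standard_normal((2, IN_DIM))
pred = intra_ensemble_predict(x)
print(pred.shape)  # (2, 3)
```

Note that the extra storage per sub-network here is a single mask vector, which is why the parameter overhead over one stand-alone network stays marginal.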

Related research:

- 09/12/2018: Rapid Training of Very Large Ensembles of Diverse Neural Networks
- 06/12/2017: Confident Multiple Choice Learning
- 02/10/2023: Gauge-equivariant neural networks as preconditioners in lattice QCD
- 10/07/2021: Ensemble Neural Representation Networks
- 07/25/2020: Economical ensembles with hypernetworks
- 02/12/2023: Autoselection of the Ensemble of Convolutional Neural Networks with Second-Order Cone Programming
- 05/24/2019: EnsembleNet: End-to-End Optimization of Multi-headed Models
