HCE: Improving Performance and Efficiency with Heterogeneously Compressed Neural Network Ensemble

01/18/2023
by   Jingchi Zhang, et al.

Ensemble learning has gained attention in recent deep learning research as a way to further boost the accuracy and generalizability of deep neural network (DNN) models. Recent ensemble training methods explore different training algorithms or settings on multiple sub-models with the same model architecture, which leads to a significant burden on the memory and computation cost of the ensemble model. Meanwhile, the heuristically induced diversity may not lead to a significant performance gain. We propose a new perspective on exploring the intrinsic diversity within a model architecture to build an efficient DNN ensemble. We make an intriguing observation that pruning and quantization, while both leading to an efficient model architecture at the cost of a small accuracy drop, result in distinct behavior in the decision boundary. To this end, we propose the Heterogeneously Compressed Ensemble (HCE), where we build an efficient ensemble from the pruned and quantized variants of a pretrained DNN model. A diversity-aware training objective is proposed to further boost the performance of HCE. Experimental results show that HCE achieves a significant improvement in the efficiency-accuracy tradeoff compared to both traditional DNN ensemble training methods and previous model compression methods.
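The core idea — ensembling a pruned variant and a quantized variant of the same pretrained model and averaging their predictions — can be sketched as follows. This is a minimal illustration on a toy linear classifier with NumPy; the magnitude-pruning rule, 4-bit uniform quantization, and output averaging are illustrative assumptions, not the paper's actual architecture or training objective.

```python
import numpy as np

def magnitude_prune(w, sparsity=0.5):
    """Zero out the smallest-magnitude fraction of the weights."""
    k = int(w.size * sparsity)
    if k == 0:
        return w.copy()
    thresh = np.sort(np.abs(w), axis=None)[k - 1]
    pruned = w.copy()
    pruned[np.abs(pruned) <= thresh] = 0.0
    return pruned

def uniform_quantize(w, bits=4):
    """Symmetric uniform quantization to at most 2^bits - 1 levels."""
    scale = np.abs(w).max() / (2 ** (bits - 1) - 1)
    return np.round(w / scale) * scale

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 3))   # stand-in for pretrained model weights
x = rng.normal(size=(5, 8))   # a batch of inputs

# Two heterogeneously compressed variants of the same pretrained model:
W_pruned = magnitude_prune(W, sparsity=0.5)
W_quant = uniform_quantize(W, bits=4)

# Ensemble prediction: average the softmax outputs of the two variants.
p_ensemble = 0.5 * (softmax(x @ W_pruned) + softmax(x @ W_quant))
pred = p_ensemble.argmax(axis=-1)
```

Because pruning and quantization distort the decision boundary in distinct ways, the two compressed sub-models can disagree where a single compressed model would simply err, which is the diversity the averaged ensemble exploits.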
