GeneCAI: Genetic Evolution for Acquiring Compact AI

04/08/2020
by Mojan Javaheripi, et al.

In the contemporary big data realm, Deep Neural Networks (DNNs) are evolving towards more complex architectures to achieve higher inference accuracy. Model compression techniques can be leveraged to efficiently deploy such compute-intensive architectures on resource-limited mobile devices. These methods involve various hyper-parameters that require per-layer customization to preserve accuracy. Choosing such hyper-parameters is cumbersome as the pertinent search space grows exponentially with the number of model layers. This paper introduces GeneCAI, a novel optimization method that automatically learns how to tune per-layer compression hyper-parameters. We devise a bijective translation scheme that encodes compressed DNNs into the genotype space. The optimality of each genotype is measured using a multi-objective score based on accuracy and the number of floating-point operations. We develop customized genetic operations to iteratively evolve the non-dominated solutions towards the optimal Pareto front, thus capturing the optimal trade-off between model accuracy and complexity. The GeneCAI optimization method is highly scalable and achieves a near-linear performance boost on distributed multi-GPU platforms. Our extensive evaluations demonstrate that GeneCAI outperforms existing rule-based and reinforcement learning methods in DNN compression by finding models that lie on a better accuracy-complexity Pareto curve.
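The evolutionary loop the abstract describes can be illustrated with a minimal sketch. This is not the authors' implementation: the genotype here is simply a list of per-layer keep-ratios, and `evaluate`, `SENSITIVITY`, and all hyper-parameters are illustrative stand-ins for the fine-tuning and profiling a real system would perform. It does, however, show the core mechanics: a multi-objective (accuracy, FLOPs) score, Pareto dominance filtering of non-dominated solutions, and genetic crossover/mutation over generations.

```python
import random

# Hypothetical setup: each gene is a per-layer keep-ratio in [0.1, 1.0]
# standing in for a compression hyper-parameter (e.g. a pruning ratio).
NUM_LAYERS = 8
# Assumed per-layer accuracy sensitivity to compression (illustrative only).
SENSITIVITY = [0.9, 0.5, 0.8, 0.3, 0.7, 0.2, 0.6, 0.4]

def random_genotype():
    return [round(random.uniform(0.1, 1.0), 2) for _ in range(NUM_LAYERS)]

def evaluate(genotype):
    """Surrogate multi-objective score: (accuracy, normalized FLOPs).

    A real system would fine-tune and profile the compressed DNN;
    here accuracy drops more when sensitive layers are compressed.
    """
    flops = sum(genotype) / NUM_LAYERS
    accuracy = 1.0 - sum(s * (1.0 - r) ** 2
                         for s, r in zip(SENSITIVITY, genotype)) / NUM_LAYERS
    return accuracy, flops

def dominates(a, b):
    """a dominates b: no worse in both objectives, strictly better in one."""
    return a[0] >= b[0] and a[1] <= b[1] and (a[0] > b[0] or a[1] < b[1])

def pareto_front(population):
    """Keep genotypes whose scores no other member dominates."""
    scored = [(g, evaluate(g)) for g in population]
    return [g for g, s in scored
            if not any(dominates(t, s) for _, t in scored)]

def crossover(p1, p2):
    cut = random.randint(1, NUM_LAYERS - 1)
    return p1[:cut] + p2[cut:]

def mutate(genotype, rate=0.2):
    return [round(random.uniform(0.1, 1.0), 2) if random.random() < rate else g
            for g in genotype]

def evolve(generations=20, pop_size=24):
    population = [random_genotype() for _ in range(pop_size)]
    for _ in range(generations):
        # Elitism: carry part of the current Pareto front forward.
        elites = pareto_front(population)[: pop_size // 2]
        children = [mutate(crossover(random.choice(elites), random.choice(elites)))
                    for _ in range(pop_size - len(elites))]
        population = elites + children
    return pareto_front(population)
```

Because the score is two-dimensional, the loop returns a set of trade-off points rather than a single winner; each surviving genotype trades some accuracy for fewer FLOPs, which is the accuracy-complexity Pareto curve the paper optimizes.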

Related research:

- ASCAI: Adaptive Sampling for acquiring Compact AI (11/15/2019)
  This paper introduces ASCAI, a novel adaptive sampling methodology that ...

- Augmented Random Search for Multi-Objective Bayesian Optimization of Neural Networks (05/23/2023)
  Deploying Deep Neural Networks (DNNs) on tiny devices is a common trend ...

- DVOLVER: Efficient Pareto-Optimal Neural Network Architecture Search (02/05/2019)
  Automatic search of neural network architectures is a standing research ...

- Evolving Deep Neural Networks by Multi-objective Particle Swarm Optimization for Image Classification (03/21/2019)
  In recent years, convolutional neural networks (CNNs) have become deeper...

- Neural Networks Designing Neural Networks: Multi-Objective Hyper-Parameter Optimization (11/07/2016)
  Artificial neural networks have gone through a recent rise in popularity...

- Evolving Multi-Resolution Pooling CNN for Monaural Singing Voice Separation (08/03/2020)
  Monaural Singing Voice Separation (MSVS) is a challenging task and has b...

- Neuroevolution-Enhanced Multi-Objective Optimization for Mixed-Precision Quantization (06/14/2021)
  Mixed-precision quantization is a powerful tool to enable memory and com...
