PABO: Pseudo Agent-Based Multi-Objective Bayesian Hyperparameter Optimization for Efficient Neural Accelerator Design

06/11/2019
by Maryam Parsa, et al.

The ever-increasing computational cost of Deep Neural Networks (DNN) and the demand for energy-efficient hardware for DNN acceleration have made accuracy and hardware cost co-optimization for DNNs tremendously important, especially for edge devices. Owing to the large parameter space and the cost of evaluating each point in the search space, manual tuning of DNN hyperparameters is impractical; automatic joint DNN and hardware hyperparameter optimization is indispensable for such problems. Bayesian optimization-based approaches have shown promising results for hyperparameter optimization of DNNs. However, most of these techniques have been developed without considering the underlying hardware, thereby leading to inefficient designs. Further, the few works that perform joint optimization are not generalizable and mainly focus on CMOS-based architectures. In this work, we present a novel pseudo agent-based multi-objective hyperparameter optimization (PABO) for maximizing DNN performance while obtaining low hardware cost. Compared to existing methods, our work takes a theoretically different approach to the joint optimization of accuracy and hardware cost and focuses on memristive crossbar-based accelerators. PABO uses a supervisor agent to establish connections between the posterior Gaussian distribution models of network accuracy and hardware cost. The agent reduces the mathematical complexity of the co-optimization problem by removing unnecessary computations and updates of acquisition functions, thereby achieving significant speed-ups in the optimization procedure. PABO outputs a Pareto frontier that underscores the trade-offs between high accuracy and hardware efficiency. Our results demonstrate superior performance compared to state-of-the-art methods in terms of both accuracy and computational speed (~100x speed-up).
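The abstract describes the mechanism only at a high level. Below is a minimal, self-contained sketch of the general idea, not the authors' implementation: two Gaussian-process posteriors, one per objective, with a supervisor-style rule that skips fitting the hardware-cost surrogate and its acquisition whenever the accuracy acquisition still predicts improvement. The toy objective functions, the one-dimensional search space, and the specific UCB/LCB acquisitions and stall rule are all illustrative assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the two expensive black boxes; real use
# would train a DNN and query a crossbar cost model here.
def accuracy(x):
    return float(np.exp(-(x - 0.7) ** 2 / 0.05))

def hw_cost(x):
    return float(x ** 2 + 0.1 * np.sin(8.0 * x) + 0.1)

X = list(rng.uniform(0.0, 1.0, 3))      # initial random designs
A = [accuracy(x) for x in X]            # accuracy observations
C = [hw_cost(x) for x in X]             # hardware-cost observations
grid = np.linspace(0.0, 1.0, 201).reshape(-1, 1)

for _ in range(15):
    # Posterior over accuracy, updated every iteration.
    gp_a = GaussianProcessRegressor(kernel=Matern(nu=2.5),
                                    alpha=1e-6, normalize_y=True)
    gp_a.fit(np.asarray(X).reshape(-1, 1), A)
    mu_a, sd_a = gp_a.predict(grid, return_std=True)
    cand = int(np.argmax(mu_a + 2.0 * sd_a))        # UCB on accuracy

    # Supervisor-style shortcut: fit the cost surrogate and update its
    # acquisition only when the accuracy acquisition has stalled, i.e.
    # the candidate is not expected to beat the best accuracy so far.
    if mu_a[cand] <= max(A):
        gp_c = GaussianProcessRegressor(kernel=Matern(nu=2.5),
                                        alpha=1e-6, normalize_y=True)
        gp_c.fit(np.asarray(X).reshape(-1, 1), C)
        mu_c, sd_c = gp_c.predict(grid, return_std=True)
        cand = int(np.argmin(mu_c - 2.0 * sd_c))    # LCB on cost

    x_new = float(grid[cand, 0])
    X.append(x_new)
    A.append(accuracy(x_new))
    C.append(hw_cost(x_new))

# Report the non-dominated designs (maximize accuracy, minimize cost).
pareto = [(x, a, c) for x, a, c in zip(X, A, C)
          if not any((a2 >= a and c2 < c) or (a2 > a and c2 <= c)
                     for a2, c2 in zip(A, C))]
for x, a, c in sorted(pareto):
    print(f"x={x:.3f}  accuracy={a:.3f}  cost={c:.3f}")
```

In PABO proper, the supervisor agent links the two posteriors directly, and the reported ~100x speed-up comes from avoiding exactly these redundant acquisition-function updates; the sketch above only mirrors that control flow on a toy problem.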


