HPTQ: Hardware-Friendly Post Training Quantization

09/19/2021
by Hai Victor Habi, et al. · Sony

Neural network quantization enables the deployment of models on edge devices. An essential requirement for their hardware efficiency is that the quantizers are hardware-friendly: uniform, symmetric, and with power-of-two thresholds. To the best of our knowledge, current post-training quantization methods do not support all of these constraints simultaneously. In this work, we introduce a hardware-friendly post-training quantization (HPTQ) framework, which addresses this problem by synergistically combining several known quantization methods. We perform a large-scale study on four tasks: classification, object detection, semantic segmentation, and pose estimation, over a wide variety of network architectures. Our extensive experiments show that competitive results can be obtained under hardware-friendly constraints.
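
The constraints listed in the abstract admit a very simple quantizer form. The sketch below is only a rough illustration, not the HPTQ pipeline itself: it quantizes a tensor on a uniform, symmetric grid whose clipping threshold is rounded up to a power of two. The threshold-selection rule and the 8-bit default are assumptions made for the example.

    import numpy as np

    def pow2_threshold(x):
        # Smallest power-of-two threshold covering the tensor's max magnitude.
        # Illustrative choice only; not necessarily the rule used by HPTQ.
        max_abs = np.max(np.abs(x))
        return 2.0 ** np.ceil(np.log2(max_abs)) if max_abs > 0 else 1.0

    def quantize_symmetric_uniform(x, n_bits=8):
        # Uniform, symmetric quantizer with a power-of-two threshold:
        # values are clipped to [-t, t) and mapped to 2^n_bits signed levels.
        t = pow2_threshold(x)
        scale = t / (2 ** (n_bits - 1))   # step size (itself a power of two)
        q = np.clip(np.round(x / scale),
                    -(2 ** (n_bits - 1)), 2 ** (n_bits - 1) - 1)
        return q * scale                  # dequantized ("fake-quantized") values

    # Example: fake-quantize a random weight tensor to 8 bits.
    w = np.random.randn(64, 32).astype(np.float32)
    w_q = quantize_symmetric_uniform(w, n_bits=8)
    print("max error:", np.max(np.abs(w - w_q)))

Because the threshold is a power of two, the step size is also a power of two, so the rescaling can be realized as a bit shift on integer hardware rather than a floating-point multiply.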

Related research

07/20/2020 · HMQ: Hardware Friendly Mixed Precision Quantization Block for CNNs
Recent work in network quantization produced state-of-the-art results us...

12/01/2021 · Hardware-friendly Deep Learning by Network Quantization and Binarization
Quantization is emerging as an efficient approach to promote hardware-fr...

03/27/2021 · Automated Backend-Aware Post-Training Quantization
Quantization is a key technique to reduce the resource requirement and i...

03/12/2023 · Module-Wise Network Quantization for 6D Object Pose Estimation
Many edge applications, such as collaborative robotics and spacecraft re...

08/23/2022 · Adaptation of MobileNetV2 for Face Detection on Ultra-Low Power Platform
Designing Deep Neural Networks (DNNs) running on edge hardware remains a...

04/26/2022 · RAPQ: Rescuing Accuracy for Power-of-Two Low-bit Post-training Quantization
We introduce a Power-of-Two post-training quantization (PTQ) method for ...

10/07/2022 · A Closer Look at Hardware-Friendly Weight Quantization
Quantizing a Deep Neural Network (DNN) model to be used on a custom acce...