Dual Precision Deep Neural Network

09/02/2020
by Jae-Hyun Park, et al.

On-line precision scalability of deep neural networks (DNNs) is a critical feature for trading off accuracy against complexity during DNN inference. In this paper, we propose a dual-precision DNN that includes two different precision modes in a single model, thereby supporting an on-line precision switch without re-training. The proposed two-phase training process optimizes both the low- and high-precision modes.
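The abstract does not include code, so below is a minimal sketch (not the authors' implementation) of what a single model with two switchable precision modes could look like. It assumes PyTorch, a single set of shared full-precision weights, and symmetric uniform fake-quantization of the weights in the low-precision mode; names such as DualPrecisionLinear, fake_quantize, and bits_low are hypothetical, and the paper's two-phase training procedure is not reproduced here, only the run-time mode switch.

    # Sketch of a dual-precision layer: one weight tensor, two inference modes.
    import torch
    import torch.nn as nn

    def fake_quantize(w: torch.Tensor, bits: int) -> torch.Tensor:
        """Symmetric uniform fake-quantization: quantize then dequantize."""
        qmax = 2 ** (bits - 1) - 1
        scale = w.abs().max().clamp(min=1e-8) / qmax
        return torch.round(w / scale).clamp(-qmax, qmax) * scale

    class DualPrecisionLinear(nn.Module):
        """Linear layer with a high-precision and a low-precision mode."""
        def __init__(self, in_features: int, out_features: int, bits_low: int = 4):
            super().__init__()
            self.linear = nn.Linear(in_features, out_features)
            self.bits_low = bits_low
            self.low_precision = False  # runtime switch, no re-training needed

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            w = self.linear.weight
            if self.low_precision:
                w = fake_quantize(w, self.bits_low)  # low-precision mode
            return nn.functional.linear(x, w, self.linear.bias)

    # Usage: toggle the mode flag on-line, using the same trained weights.
    model = nn.Sequential(DualPrecisionLinear(128, 64), nn.ReLU(),
                          DualPrecisionLinear(64, 10))
    x = torch.randn(8, 128)
    y_high = model(x)                          # high-precision inference
    for m in model.modules():
        if isinstance(m, DualPrecisionLinear):
            m.low_precision = True
    y_low = model(x)                           # low-precision inference

The point of the sketch is only that both modes share one parameter set, so switching precision is a flag flip at inference time rather than a separate model or a re-training step.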


Related research

03/15/2022
LDP: Learnable Dynamic Precision for Efficient Deep Neural Network Training and Inference
Low precision deep neural network (DNN) training is one of the most effe...

09/05/2019
Detecting Deep Neural Network Defects with Data Flow Analysis
Deep neural networks (DNNs) are shown to be promising solutions in many ...

07/08/2020
Accuracy of neural networks for the simulation of chaotic dynamics: precision of training data vs precision of the algorithm
We explore the influence of precision of the data and the algorithm for ...

04/16/2020
Dual connectivity and standalone modes for LTE-U
Long-Term Evolution in unlicensed bands (LTE-U) has been considered as a...

11/23/2017
DNN-Buddies: A Deep Neural Network-Based Estimation Metric for the Jigsaw Puzzle Problem
This paper introduces the first deep neural network-based estimation met...

12/15/2020
Gegelati: Lightweight Artificial Intelligence through Generic and Evolvable Tangled Program Graphs
Tangled Program Graph (TPG) is a reinforcement learning technique based ...

12/23/2020
Adaptive Precision Training for Resource Constrained Devices
Learn in-situ is a growing trend for Edge AI. Training deep neural netwo...
