TaxoNN: A Light-Weight Accelerator for Deep Neural Network Training

09/06/2020
by Kossar Pourahmadi, et al.

Emerging intelligent embedded devices rely on Deep Neural Networks (DNNs) to interact with the real-world environment. This interaction requires the ability to retrain DNNs, since environmental conditions change continuously over time. Stochastic Gradient Descent (SGD) is a widely used algorithm that trains DNNs by iteratively optimizing the parameters over the training data. In this work, we first present a novel approach that adds training capability to a baseline, inference-only DNN accelerator by splitting the SGD algorithm into simple computational elements. Then, based on this heuristic approach, we propose TaxoNN, a light-weight accelerator for DNN training. TaxoNN can easily tune the DNN weights by reusing the hardware resources of the inference process through a time-multiplexing approach and low-bitwidth units. Our experimental results show that TaxoNN incurs, on average, only a 0.97% higher misclassification rate than a full-precision implementation. Moreover, TaxoNN provides a 2.1× power saving and a 1.65× area reduction over the state-of-the-art DNN training accelerator.
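To make the splitting idea concrete: plain SGD updates each weight as w ← w − η·∇L(w), which is exactly a multiply followed by a subtract, the kind of primitives an inference MAC array already contains. The sketch below is only an illustration of that decomposition, not TaxoNN's actual datapath; the function name sgd_update, the learning rate, and the toy objective are assumptions introduced for this example.

```python
import numpy as np

def sgd_update(w, grad, lr=0.01):
    # One SGD step, w <- w - lr * grad, split into the two primitive
    # operations a MAC-based inference datapath could time-multiplex:
    scaled = lr * grad   # multiply: maps onto the MAC multipliers
    return w - scaled    # subtract: maps onto the MAC adders

# Toy usage: minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3).
w = np.array([0.0])
for _ in range(100):
    w = sgd_update(w, 2.0 * (w - 3.0), lr=0.1)
print(w)  # converges toward 3.0
```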

Related Research

Optimizing Deep Neural Networks through Neuroevolution with Stochastic Gradient Descent (12/21/2020)
Deep neural networks (DNNs) have achieved remarkable success in computer...

An Embarrassingly Simple Approach to Training Ternary Weight Networks (11/01/2020)
Deep neural networks (DNNs) have achieved great successes in various dom...

Acceleration of Deep Neural Network Training with Resistive Cross-Point Devices (03/23/2016)
In recent years, deep neural networks (DNN) have demonstrated significan...

Partitioning sparse deep neural networks for scalable training and inference (04/23/2021)
The state-of-the-art deep neural networks (DNNs) have significant comput...

Layerwise Noise Maximisation to Train Low-Energy Deep Neural Networks (12/23/2019)
Deep neural networks (DNNs) depend on the storage of a large number of p...

Is SGD a Bayesian sampler? Well, almost (06/26/2020)
Overparameterised deep neural networks (DNNs) are highly expressive and ...

Field-Programmable Deep Neural Network (DNN) Learning and Inference accelerator: a concept (02/14/2018)
An accelerator is a specialized integrated circuit designed to perform s...