TaxoNN: A Light-Weight Accelerator for Deep Neural Network Training

by Kossar Pourahmadi et al.

Emerging intelligent embedded devices rely on Deep Neural Networks (DNNs) to interact with the real-world environment. This interaction requires the ability to retrain DNNs, since environmental conditions change continuously over time. Stochastic Gradient Descent (SGD) is a widely used algorithm for training DNNs, iteratively optimizing the parameters over the training data. In this work, we first present a novel approach that adds training capability to a baseline, inference-only DNN accelerator by splitting the SGD algorithm into simple computational elements. Based on this approach, we then propose TaxoNN, a light-weight accelerator for DNN training. TaxoNN tunes the DNN weights by reusing the hardware resources of the inference process through time multiplexing and low-bitwidth units. Our experimental results show that TaxoNN incurs, on average, only a 0.97% higher misclassification rate than a full-precision implementation, while providing a 2.1× power saving and a 1.65× area reduction over the state-of-the-art DNN training accelerator.
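The SGD update that TaxoNN decomposes into simple computational elements can be sketched as follows. This is a generic, full-precision software illustration of the algorithm itself, not of TaxoNN's hardware mapping; all names and the toy regression problem are illustrative:

```python
import numpy as np

def sgd_step(w, grad, lr=0.01):
    # The core SGD rule: w <- w - lr * dL/dw. In a training accelerator,
    # this multiply-and-subtract is the kind of simple element the update
    # is broken into (illustrative sketch, not TaxoNN's actual datapath).
    return w - lr * grad

# Toy example: fit y = 2x with a single weight via mean squared error.
rng = np.random.default_rng(0)
x = rng.normal(size=100)
y = 2.0 * x

w = 0.0
for _ in range(200):
    # dL/dw for L = mean((w*x - y)^2)
    grad = np.mean(2.0 * (w * x - y) * x)
    w = sgd_step(w, grad, lr=0.1)

print(w)  # converges close to the true weight 2.0
```

In a hardware realization, the same multiply-accumulate structure used for the forward (inference) pass can, in principle, be time-multiplexed to compute the gradient and the weight update, which is the reuse opportunity the abstract describes.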




