
CORDIC-based Architecture for Powering Computation in Fixed-Point Arithmetic

05/10/2016
by   Nia Simmonds, et al.

We present a fixed-point architecture (source VHDL code is provided) for powering computation. The fully customized architecture, based on the expanded hyperbolic CORDIC algorithm, enables design-space exploration to establish trade-offs among design parameters (numerical format, number of iterations), execution time, resource usage, and accuracy. We also generate Pareto-optimal realizations in the resource-accuracy space; this approach can produce optimal hardware realizations that simultaneously satisfy resource and accuracy requirements.
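The paper's VHDL architecture is not reproduced here, but the underlying method can be illustrated with a short sketch. Powering is computed as x^p = exp(p·ln x), where both exp and ln come from hyperbolic CORDIC: rotation mode yields cosh z and sinh z (hence e^z), and vectoring mode yields atanh, with ln w = 2·atanh((w−1)/(w+1)). Below is a hedged floating-point Python sketch of the *standard* (non-expanded) hyperbolic CORDIC iterations; function names and the iteration count `n` are illustrative, and the paper's expanded algorithm additionally uses negative-index iterations to widen the convergence range (here exp requires roughly |z| < 1.118).

```python
import math

def _hyperbolic_iters(n):
    # Standard hyperbolic CORDIC index sequence: 1, 2, 3, 4, 4, 5, ...
    # with iterations repeated at i = 4, 13, 40, ... (i -> 3i + 1)
    # to guarantee convergence of the atanh(2^-i) angle series.
    seq, i, repeat = [], 1, 4
    while len(seq) < n:
        seq.append(i)
        if i == repeat:
            seq.append(i)          # repeated iteration
            repeat = 3 * repeat + 1
        i += 1
    return seq[:n]

def cordic_exp(z, n=24):
    """exp(z) via rotation mode; converges for roughly |z| < 1.118."""
    idx = _hyperbolic_iters(n)
    # Gain of the iterations; pre-scale x0 = 1/K so x_n = cosh z, y_n = sinh z.
    K = 1.0
    for i in idx:
        K *= math.sqrt(1.0 - 2.0 ** (-2 * i))
    x, y = 1.0 / K, 0.0
    for i in idx:
        d = 1.0 if z >= 0.0 else -1.0      # drive residual angle z to 0
        x, y = x + d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * math.atanh(2.0 ** -i)
    return x + y                           # cosh z + sinh z = e^z

def cordic_ln(w, n=24):
    """ln(w) via vectoring mode: ln w = 2 * atanh((w-1)/(w+1))."""
    x, y, z = w + 1.0, w - 1.0, 0.0
    for i in _hyperbolic_iters(n):
        d = -1.0 if y >= 0.0 else 1.0      # drive y to 0; z accumulates atanh(y0/x0)
        x, y = x + d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * math.atanh(2.0 ** -i)
    return 2.0 * z

def cordic_pow(x, p, n=24):
    """x**p = exp(p * ln x), valid while p * ln(x) stays in range."""
    return cordic_exp(p * cordic_ln(x, n), n)
```

A hardware version would replace the floats with a fixed-point format and the `atanh(2^-i)` terms with a small ROM of precomputed constants, which is exactly the kind of parameter (word length, iteration count) the paper's design-space exploration trades off against accuracy and resource usage.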
