A Novel Method for Scalable VLSI Implementation of Hyperbolic Tangent Function

07/27/2020
by Mahesh Chandra, et al.

Hyperbolic tangent and sigmoid functions are used as non-linear activation units in artificial and deep neural networks. Because these networks are computationally expensive, customized accelerators are designed to achieve the required performance at lower cost and power. The activation function and MAC units are the key building blocks of such accelerators, so a low-complexity yet accurate hardware implementation of the activation function is required to meet their performance and area targets. Moreover, a scalable implementation is needed, as recent studies show that DNNs may use different precisions in different layers. This paper presents a novel method, based on trigonometric expansion properties of the hyperbolic tangent function, for a hardware implementation that can be easily tuned to different accuracy and precision requirements.
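The abstract does not spell out the expansion used, but one common way to exploit the addition identity tanh(a + b) = (tanh a + tanh b) / (1 + tanh a · tanh b) in hardware is to split the input into a coarse part served by a small lookup table and a fine residue served by a short odd polynomial, then recombine. The sketch below illustrates this style of evaluation in floating point; the grid spacing `STEP`, the saturation bound, and the polynomial order are assumed tuning knobs, not parameters from the paper, which is exactly the kind of freedom that makes such a scheme tunable for accuracy and precision.

```python
import math

STEP = 0.25  # coarse grid spacing (assumed design parameter)
# Small LUT of tanh at the coarse grid points on [0, 4]
LUT = {i: math.tanh(i * STEP) for i in range(17)}

def tanh_fine(b):
    # Low-order odd polynomial for the small residue |b| <= STEP/2:
    # tanh(b) ~= b - b^3/3 (next term is O(b^5), negligible here)
    return b - (b ** 3) / 3.0

def tanh_approx(x):
    # tanh is odd, so evaluate on |x| and restore the sign at the end
    sign = -1.0 if x < 0 else 1.0
    x = abs(x)
    if x >= 4.0:                      # saturation region: tanh(4) ~ 0.99933
        return sign * 1.0
    i = int(round(x / STEP))          # nearest coarse grid index
    a = LUT[i]                        # tanh of the coarse part (table)
    b = tanh_fine(x - i * STEP)       # tanh of the fine residue (polynomial)
    # Hyperbolic addition identity combines the two partial results
    return sign * (a + b) / (1.0 + a * b)
```

In a fixed-point VLSI realization the same structure maps to a small ROM, one or two multipliers for the polynomial, and a divider or a reciprocal approximation for the combining step; shrinking `STEP` trades ROM size against polynomial order, which is how accuracy can be dialed per layer.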
