Reduced Softmax Unit for Deep Neural Network Accelerators

12/28/2021
by Raghuram S, et al.

The Softmax activation layer is a popular Deep Neural Network (DNN) component for multi-class prediction problems. In DNN accelerator implementations, however, it adds complexity because an exponential must be computed for each of its inputs. In this brief we propose a simplified version of the activation unit for accelerators, in which a comparator unit alone produces the classification result by selecting the maximum among its inputs. Because the exponential is strictly monotonic and the normalization preserves ordering, we show that this result is always identical to the classification produced by the Softmax layer.
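Since Softmax only rescales its inputs monotonically, the index of the largest logit is unchanged by exponentiation and normalization. The following minimal NumPy sketch (illustrative only, not the hardware unit described in the brief; the logits values are made up) demonstrates the equivalence:

import numpy as np

def softmax(x):
    # Numerically stable softmax: shift by the max before exponentiating.
    e = np.exp(x - np.max(x))
    return e / e.sum()

# Example logits as they might arrive at the final classification stage.
logits = np.array([2.3, -1.1, 0.7, 5.4, 3.2])

# Full path: exponentiate, normalize, then take the argmax.
full_class = np.argmax(softmax(logits))

# Reduced path: a comparator alone picks the maximum input.
reduced_class = np.argmax(logits)

# exp() is strictly increasing, so both paths select the same class.
assert full_class == reduced_class
print(full_class, reduced_class)  # prints: 3 3

In hardware terms, this means the exponential and division units can be dropped entirely for inference-time classification, leaving only a max-comparator tree.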
